[Binary archive; text not recoverable. The blob is a tar archive (owner core:core) containing:
  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet log)
The gzip payload cannot be rendered as text; extract and decompress the archive to read the kubelet log.]
xDj\RD{1Pq`3ZTSS='co_2Y餲}H~ĭ?γm:OsP &#>~k|VTL:8O!gݏo+30BVyS;^ "-ΐ8w/@ْ)_ NS $"7R J1uf<׶:x}:M~ _E-6`M@ZD:$UXM.M0n=wQx6x?& OCnLF'i{9jCg7grŅf(ٰ iTOV63׻vśW~}훷WŠ msZCBމ_w"@xw~ڸk`&]SŖ~NJiׅqs`(Jg?}v_.mݿvBl  HR_Wr`2ՠ7&+ MEk'#~ܯzܵ/T"FxGA:7oΚ~>Zۥv"Wn$IOgj@dGD;NLȍdV >bRHznoDy(kXS)01 L$ $D|,D$ &u:l}؆}k>lhh' hc7֖po:?1Z(F--t `Z/q,G/UEw6?OA7649P0I<B~__"gWsWcx^݌)\1TJYI߄s- M6sagnFjo$9)JqR\(|QБÜYxP xT02kDJh& Um}G%0xߵCJna-ۢ*uZw qZ-_9t js~yףC򜱢މ2`j($Ť 2Ng$QzmE),5B)ȅpa{D6jl Cz ^ O{ r%`p`a+1r68fI!}!ٲokjpg>[m~j5uB>wϗ;eű0U+1aGS,KT+KryBvʾjeNukx[XKy9J/Ǹ/qـQc<`C:<Ncʠ:nN,MFi봢ZzJFQs!SW:M| r B($\c̤P_@cĜE [ZIv]6D0DanKfD Y OWaPxj݇b4 \>Dԗs`iPhް}q?JO_ܸ}jd[k3"y`A ~pS*YeEZ ۼA Y{L%= qύ2 e?{MH QN)9,"@gz%H'288qX, GEVy.Hm#s"n4z0bQ D0=D!딚2""&ZH0<)c" G4FΆ#k\~3~BgˮΘxssw{U>I|$Pͅ,YI22|ZH,ƦC ș W +bO`q*hD⁵&PfTet`E刊`) N"X!Ӱ7Fz ^9 . pzh嫫~HIMQe'˒VG]hˀ\D%j :貂JE}1etéhFvKgujQ~VT% B]N]=uq:MU"XQW@GJT2ܩP]ʵbGQW\"E]%jT]%*W(U#RWjѨD.FǢms٦ԕ tn`>r>[X9jgmirH9F/ݫhx[>a409wxqNPv9ҧ =ߛ!>NCwy1L3؆; {B, Ƙcx7ދY>2t. 1&iMg^d x(jR__k2/7ʡii79da0"kY"M;M2Cv'y1n201xOܹn1FZ7wspK3N1JQB"WcQW@-dTW5[Oӌ &/?Px\'T9;OT9;}ͥ<:c11Xljg` "-FpŵX` c9_]*zv ueEx;ު; HWf+NH~uu޵#鿢6E3s2żAk,9j9غXՒ,S7ؑ)]UXb,AM! BRq TO^ګ/I :DJD+Ǚ% q*FJu5N[lpdRj6 :re.%` bTEsPP+H!k E8*SҢuG8*E dAQ9(P'G_㛨=4.*?U펩'r~ of%4(5QxSB6^'@ !)N RT:&B8Pu2;Coonil~ɖO28V'(v͉) s9 Rb2=fd;3,(RjR0}4U핾:.tQR bUtdID :Kոj|Y2H&U2 Bj!J38ω}eYK2scc D\k*O5Zks? 
JnҢh78'P{lCq{iQsY' %`spll3C8F`* i>} q%#f./@>k&.^qÆqEw|X#ٿ2H^3FŌ[h}#3͛.5ZswyA>j#3 fnMSϷ7P4[NZo -}>ms'3Ɗ䙠Yڸ ދh-azQ)Mttǁ}0|9 tb) LD tҡ}AiV@y3(v''|?\>\?O \"3|߽2U˴ux΋K5Y]%VuiK!?BOjo`ɵNF N|RWR~HT(D9J&^)!QFu]vf&#0\0N9bCT5:.kNo~AQv'&g@F2}7 ['%lږ+|NHkҪwPwo =eŷ.|Ovz-lוY/J,/:L,՚,nA)><\6ގvPn9)s|WiMi<>.l\L<8k z%TL&a#l**I1hѦ t˸sbr^獤x6P&5ں9]7CO7P}f-i@0}WJ$$9::I x DHŞ`oLhYTc\XV[zI{S1o҃zb Kp1rKZx<MRK4㽋.71,7ey"RRUR‹R*)ejWRTJUR +).Bq|jdU K*/BV>lw{)Dq}N=;S)Rت[nGd}/x>=VcxQHx$ɥ LA IHx}\K(hýEˋZD9ϥfH# iC[;eT!2i _u >?3יpimjݏ}ۉy^Cd|k{-i]{\|]a#hUqkfJD#b>5ε@!JgG/4~"sIp%^ ׈WI\@E y4'4H6BzR@!*.Pq/džq?aN&º jHc@I*WDa>8%A#c0tiAϼ.YKvC4k$X5&Čv?rP bD%CK4*4EH+l\9f it`[bIcNvlSw~̭'O-9 q75ޡ2ŷV JFƳ2]]a; .?[ APpyhaD[1%n,qE5ٓN. #W@qNuo᪞ SGx@J8j=!HBt\O.WZ, 8o}qvc8d)U,#bYoͶлiZ4 {w={rBRAfߥ8+ʹ1f`׉81<Ѭuo<;57~._fω ~Ӡf:<쒓7 4ѢS;? B֏[GrHmðajfU>Ȇ85 g0b^G|1ɦSQ6j۳J! [Z=8yf6?!h!3kzavOү{~EۄZ]`닐Ӓ/eܿyT`64*D?FL#JCO׼w{Ukz}Kh=lI; 1uO3 O҂p1| 1) /;#ކvq3ijXJRA 8ae6$cm>ƜEcݦNMeZ'e%s.޾[;gQn \ywWuᖧ T!a,լ"|WK[ 2 q`2)$}&sM"6h81=i)"yQZ t!^sϖMrr+)E$gsG#w.ʞ!+(80AY@a0JRQk"tD %Yhs;hg'?SqȖ*s&i. R4HXAr S\P% rQo!H')g$)Sp`%G+@%*TДQ26Dqj+Nҹ}gkꪫJK|$Vvd}I94\D'D\ D3 dX 6@:# юIr񜧌#ʐ砝ĵ|\qΣ"V RYF B'cg$c8;Se'>}o Zx;:w܁Za/%Կuege+CyS@Z፪X K\cIMpwkԩOb~ОOw'J^X5"rtr:!k򮖊Hxl I&]p\Jвi@tnI$B" Q9@AC+9Ԫ:L7d?H|Vw3VeCAՈ ?;Qɠ`}N}4AȠ˧4x2+s"`q!Uh*:M0:,e]ҳOY 11kD|cty"b*2tD ⨔ `b1*&ijb NP*D,T9H OHS*CmBdЄ8q=l{":\$gZ_POmk̈pnz n*ENDn$ϱ.Vw6]^? 2kغմq:Nmdͳt-+lY͸un7Mﳞo|vnZn懜u͎gޅGô?1w[:%|LS|bKOG`ϛy64Vo'kIcٖS]:D9l05Y Y(0ٽkūy2"2N)D-CUgde-_a8?0~7LTq{Ks8Kk1j|#ez^Vq^8GHrҸ>'+w0~npR*х$Oݻɧ_.Io;[t `XNDk2>ʤ9~4 2O}YX3ZPdpD|ܺ]4w˕sKwQۻEɚҐBH--JSZBHƱ`c) 1(T P [Y:Q's]CgOAeѷAt:`yHa60cex% -m& x%tAn}C띖T LŔ >GS&"E}N sFGb]H26%$왋NސM Q F)85Irm]$"(G8p Tx(!c-NN:I$퐒vv(t. ߴhg`vK^{:L#u&JW"Z^9l(a]LG;r@G8zxnF/PmV')63VSsFS[မY(ĕA@L$HΘ 53 Bre6GQ)b%+#g;B8W=oڡז`ػ޶$U@pef7,觭Ly$9IiQ-bdUu=ǖ/E[yɭtQxРjD>^Ɓː:9_"VP$ sK6QbB%^;myk=.h-Ӕ. 
BM68QwP (sN9~ñ 'T|!)Q?˰\1 -ʒIr A"z$봘YⓏ,6G(%B x1Wt^)IiLFJM4/7 H|ABŀ{M(yuqmXz:, j>< )ĬC/d^VP™07—DlQWׯzW;3xa*zp} w#XȽaQ~7 +~؍ZK$Ng'*9Y/J@Y%/3șW"W{|2^.]g)[šuEܶ:&y:-O)T`+eC-fu}.~]gU_ʳ5MnX*{do?UVaU 4m]N};?`5fdZ5W~bѺF+(0l4Mp #z!Uxd!R r9RSFDD b &UZM'/N:֒B^wAG ҈*ݺd'?*Xd0tb 3nLn"*"i 7_PuʎOo@صs _F}|4Gaf[ (OT9ŢkBh M[*.X*MMZHXGl+  I8FҎY | \K䭗Gg-C7xO(R-_[ak-èv{WRn]>d`hx-`vM^3FkE (Ĕ*$14 %EQT f$:\]nL- Th44pvtNHitL`Ra(͸@/#Ayf%${,? U(BXh̪ͬ, R% QP"Y .HjKfqT `-,}ậP[69al4,rCF{-ritP4hZH;uH2ғ̜"j40DaBBJ1ho8j6S46sη4t ΛHein q5V~:EaNo*)5y"ٵ)rC'_:FފN6 9YurίƱ&* <+s*[еARLܡ]uVڳ$[13F(\ ۠΂Ơ|VqR`vY4a#A2 _o*M|QͱCGGBGكթ>6{,^ܫ/Ed^kUm1ga RIݸׯMalLgهPR*_"FIN\<0`{*;S)%"h@QAsglrD`p'95r6sJqrMA{ϖQwq;Shk)J;3&Ad"eLJ9nr;7ނJv^FusSMziy9y1ka2JQ&H*8@"R@ĜD8+rőp.uXERvK"2U?R.!mB%A}np`NV9>栗:HJ;B}hIC{>2DboJً@ iŤTiBT )dĠXy@U(C0!B*Caj3jX$"9@ h@2l- jlg,`*z4o{}l0ս.ߢiFqwe 솮]xO;ieyJ{[QJVZ[S Ȁ.& ѡuOa:цNm6ԩYv-sh׫unMg7w^n\!67p={mgQmW7t\{qgKs۞vw8v^mx1 ^̕t(4N7_ __nU|0[>-SNyp4rkD^F9ƢS磲oT+&5fɊkfap6w2 Gԁ8ZZ4tt0ã?i a9`xvsrX{ZRc4)&SNq飶ěU`̙?\L];Rґ>ǺAo-K7)~ {8:L¤OP fhBSSt$LхTm=,eE.qVq@a(.FrY D{@'Ҙ6(C,X7DcZad(QHLLN Ij$f.uj\̙2eJ'5(Z5; EUl:sG ct$ms.`9VGwB;#)EВ)E 0shHURJU+ϷU /[NS[[NY1YQU#TPŽnysuvSW㶛zRTMgbԋ^Hlf 0[`-lHI4  ooUDO?ſw8+a<6I"WJM%:~݇v_vjĞB# MWX90B*% -`.UVC{oz?NBgQU34.CQhyq+ǣ[wOX76tc sHv0~gFmݚvcaܳ-n-1[rΣ~wD&!W9LY:=V)(; ZDl ha[0 ^Fd0֎7%ۛT}kTj =dXO៣AhIm9/kB>boK]{xR?!rF U`@E OA\U M':@ o[IqA;g<`eBca)OE A(aVn?B,YydcOPD4^JJd 1Y,-,xGI4 vٮOlj_ڀpP6qPL3v8|TZЌ9\NmyU4wn& 8P  S"T[B*1ؘhoovMU,#[ UdI]: \fHHͽ(4Ā&p5?PV"}1 Js ;JQ~`^uʹ WLPhNH""Д%Nw H'p#it& %{Ƅc\i'yD;/H=oפ}M1CK-4F!64 )$rm(l_=8Ûg}Gc!["ZXQ}{ro |rx"kE;oF}WsĠ6ba/>ow>WHf~dF> Z͎V*qBY(O<$ !(i}1=e_̳#9Y"J'i\PWXٓK@PKh|;mLM t>k<xK*Ljd<)MjʈhA #(H8Hjeκp}=9&tx`.X B.֎^﵏Kon<"H0 ,M>' d{&Utۛ]~(As7144*҂'63k(pLpPLJΔ23af<aX֙B{dDq`;2M:<6(twÄɃ{iqKa&OmG-f3~7❌6MTݠ2Ǿy0TO&x6*-DI־^}-)1J͂gÜ(oF*%I w@& z/LÄ(.sL1MptLagS Dž`,N3Z{kWxdpv\aHoNO2ӃNL$idGm<\-V8mٖ>-M)M;1QaUQ_*jP܂Bŵ*[Вx \ sH[]D‰uXT֒FF:>ҭ9$9 h.PHØI->MLj9+j$>w~eGF{UftҤ`if_,?֓e[vY;'sUoE}rllL|ĺ|\x]kW];_rzht~:~wpY+`dHp.ۯxo;Dgs^Q"AOwx^A% ǫm[g {Y5p^Y^GTPJm(HM&zG 
SV2";uxuގL=,{ހQN8^KG Om6c3fkZL2"+׽3qPw-[9,Q!N](KY|Geu+7T NWip-ӊ Β@xdYd X!dDj!͑[#oR1뼮T5*eYBq.~|3)Kla|`7\x̒$#3稅TrlT#g^IH=i<1fc&PfTet`E刊`) NЪCe n Vzt<DZ!m~l+Cې`ST"[G,9êvSK 6E} Ƽ43p\qc 9qἈHv,|I)[Y/oV@$^^ŮmnQu]Mx=]{XVi#ŀh(jND(=7Z2ZCqQQťK)6Vg$\r.AO )p+UYJytx-A[Dqyֲ8;3Ra ]BzQ!.5AA0(BBJ1ho8h6S4/E8߱>Tָɯ3.0yRO?y /.l@wg*Nrge:ei.;s <ЙxS;^ ] T]~p~mؗΊ)_e EeX)\S`;tq- S/1ۋ7w0! H H B֭&W `X|߇&UU2B+?]kxܙTOT_җRt ӫ C0t//QZً·{7UUu}gKg&OwǓ\u\p?\= [7MTWvI ܏𫏀|BgQ31~ۦiH4\4N*d} ++V1I8]r>g7ry <%fmzW&Yլe:S Is@ G. }_0v ֧ ɠ zvnǟ~>_a>ǫ?XqU^ ?_TS2557*:StR Ϧ (LJWTqZ: I0sR} h$TOdѸͽ֍67a޿Ϋ30w+ȗtSHvP@EP}5~OS:۽5>Z%+[Es#Iz8VF:hijBwGBD Fҡ SRT)N&1*5c@V LBlHB41FB, I(gPCN7Ўlle¼h#f'ʍM+/U9~8K+!Ϩd; \ΥdBS/d(Lx%p[*r莗K)J"8,UR|=z{d`қ..k:. pgZLJ7RV

{qI8-6Ps$nЏ^jX}d?eI+oPw:VRp L&e| Ty.ul@yv뻒dxfNycpm0JRHq9$ m ]ߵ-r8d ]z]{o+us\,L0A]̡tFttAc"ik4!4r S\P=4ANSLP(d<1lDJ#$LisV ~&H|$VjpB>Z:ic@·|g$M2Pm->'OR8eL=ZRpC28DE8Eu4yTJ"TQB06,{ٲoӺښ%.H3E2H\0||tMf_n E\)li*c@_ S^twtڠѐa'܆@.mZf<$p[ N/>%Ϙ GgV,M'uu]]t@KGi qYFIf ;`pgFEb˫AK+QQbau" :0%̋zOekf>]zE,J0\渶!TE)I&]Uq-i@6ItB6 HJN sVL+MMo9P9Z,SZ;jy=|`ywj}urhZnyj&X k?/`N ˉ;> ;ͧoI'?z ~R{ųHҼ1[;f ןڪ9oSFxI g=\9$hFD !<;T.K0У ͳV ~K /m[WMrG:ـuW[VrZ`n-o|bNo\0 7t Ż9o[ZI:6~iSӞrzekN?mbjs;_}a=OI=r~cv%)(y͋o^|߼7/y͋o^|߼͵Rg ?+*( jcU^7DQ(%XP!ӂzZPO iA=-ӂ`lq)jƽA]] [㌧BѲ.EڂQ[0j _ Fm"c9`I+V0j Fm-/+@Z^ |ez-@ER k,@"kpfܰY< g,Pʳ@y(Wb 1өc5'RʏIR1b{B~u=t'jZR 4ńxxK-n7]Y%8 RTY Vɾ{ k&lJR ?-.!&r ,"u6߭^ 4ÿ3&(/|tTȬۯR2%)%gi?Wq݄^;AOCb]6ῡN\`-JǃRL DL^A&g(f6'@ 6Z{+L٧u+M4և ?w;0m?f^T.yRt3ˊ9JZ$|BD =f Zs&Q8S2GR9KVPY+h\9_ aj jQwWSAM'TVR"#]$sd,y8Y ]!W",k@byyqcZ^ml;y+DԣԶL 6"[mqY'vT|f\1i{!-H*& >>揸P4NZoG\lNz:{I`8LK9XI콡3lIƨah6سknǓ.;Jsu& X_O&+7CSwb@yQVԨ8Ox[ }ry1*2[cfIdr[墕 :9l#em_}olw1'Xp[STol:z]uM<u :Lfᒓi]x(K߭k}LRH]DA%Vhd(Ww)Ruqg(׺ffk6F+XF ۀ2T9Ee.%J1*Z43oo[9P--zdӍyB5_=y}SYk_Ӡ 14rޔ Vq!9ʼnTh{S4e\STXo/AvQ̦co|W 6_^b*r-os Fi-u&GM E\}mEXXލ֣Eb@̎<sauHK^&1@{eAuZd4e(i4&9eh׋ǁZȱCX;83r綼OֱbTJ֢QL b'T\ VܜLqLR\ ke*)ŵ^Lq-壗g1{S w.7ɁRJHBS:F{7yON6js<*2%O^h?ʼ*0BzR- BT\f Bmqǚr؏>`j-۰A8)bHh ؘ[i2LKA (?nlMϵ5m#HW%kI*G3t&"Dф-<|{j +HOJlP%Ye@x$;r VR&i&喩 BYSwq̭k_.:q_ͿY<*s~+=}kt21ZYpEy·wl?;|nտ xht{q">v9X?a,qF]I'b]OZqN?Qzg1/ggghE+7ZOHt6P"=׭/CC?S> 1t,зXs\Z-դZ]\!LުP@*#HyiN{81y 1 opB>;3*ό 0IN<:T_܌&o7Ùikb~u?]5+:Aq 8oI|$X9GjzV3*QdCj+W1~vУ1_uSN'jUsB6u]TBC u8g3hfAJ5*6u߽Bv廷?.|~ͻ}{7+0> A70;а[8th Άm@5Cq7gI{H+ }1>#{`]ki3H IGbM)R{0Wzwÿ] 3>+nyBB: +9eyNgnѸͽڍ67fܟUX/ QC:HvPwEP}:tvڥ0oxFqF tjIN? 
.vɬ&{|$Ĥtކw !&1*5c@V LBlHBtJ3=,D$ 6u:Tpb @; m,v99 rmҤS #ܒonATU, 9^v9&&* < -L )&]OWaKPYAVXd|R#\ bgb #U=)`k U b+t@HmOjRt*4Xb@'N!K 鑤?,"ZS4lJs;5tY d;f^Kw]5ӹV8viGJ2<e3C'lЇIF J+o»d^K,gl~r(//*[:俈O-u< ˎ9s˭#:F 3v1߷ShJ'^"69نr\Vt }M| r B($\c̤B__cĜE NVFq: 1rxA>l!^Bؐ3(9 _-0f+rRa{u]õ!ٺI/Bޒ4~0VCˏwmWi%o#Xmoqecc , ϨVGSJao><d^3ƌ{%n@®߼&³ T:ķlcύGRL]S-B"1l+&O$$L'9mOho*Wbkۮ1Y=d=p8'2n^۳n5ҫ #dkF׻[5_y߆`M>+%R+qiKrwx+k\3n2^׏USE.[U#^wə?s&h3 ТJ$flOoSδL7=f -{S)RlA䰬 }LA7iG8c[b.Qh =vPCݬ "Lxהs޺uigi/Eބ '_)[P,s0kC\YrD8S]W 5Z' %ȴ)5UBFA9RZ0bQ 3=D!p2""&ZH0<)c"-נh 5Fָf# jc i]_aEjnVp>7e2|y]?"kK1|ޠ d% GsB`9665w:FIN\<0 {*;S`%"h@QAsgт#*$;A UtȴLWY2[;Z"QƦ( ;β| RcT<5 `MF :5 }W,ڽFMFQC4c9GîB h]%(dWT _Q {Ňl)*ӢEE,4ygr眞#yQ ss s"99UBrʔ`V#b0Ehtca ZNM'(%d q|'~tll,fq Uz 8=pIib]2&SM}ɴ}R2 r &XFvƉ73ϳ6=Թ4%GJmOՋf936vidlgiﰳngAMn`1ch`AX4 \ŢIJu @)Qgsl+k]䕙KZ6oETa^ˎaFo8ZZ:a恘\*%sM#q$"Ž7+mI@ճ#΄$a}GYk6ڈ*2ttC2ML.Ӥ42ML.Ӥ42ML.Ӥ42MLM-K׭[3K 6_K) ifP`9sy4YYCoV@$v]taD~,;w}ae6OV0Ձŀh(jND(=7Z2ZCqQQťK)6Vg$\r.AO )p+R3M)AxokYsk44x~0zneVz$=]VSs53oϧ[?YͶ;d5NP"BLF*0FL2iU܄t<F5@*'YCs\ _jK\TxGDJNc4/fiK`$ O+kG\9;Bjj޵XDM} QG\E! K74^)VFXztRY$Fqysd(=Nl?y ρ3؀?jW% Pxge6y :KÜg~{}6{%dSVzjp)H"^~؇lɔLޯG8cX)\S`$"WbNx.mx,4u>7Ի`Utb PCRR\֭>Ns?& =!kF} ߳+Sb^~~lKx~6:^ʁtzUt,Z 8SE. ߯AN|<.bԨ2~F~nhtV>\kky酋bvJs`.ͷ {>+Kbe+Nl0~/K #i#1z˺aH0\SE@> 8g =^ٻM_Fu6ɺQ++!GQH40z'qOq6,OA1ۭ x_~7(ogw`F__uˏ/_~$J  T;pk/4aTo*COcB@(J?doW˫T8 I0 Ri4*Wr@yE7D݂h^L`FpDqQ3O* ]͆bs>RPEn:p WS}5}}tSج*=ޫ7@ՠN="aeuQpɴHf5!ػ#!Fad"\ԆvF}{f%/__ &[hXS)16"UME &5,!cPCq\_툾J6u47 V:-!tߺʶG;4?QhZt dZWuKu,_9$׵94mtbR9c6(ARLĘ]:U➶fFvYT{BDy8f:z."4Y4lUO[g̐H yv,r;ݍЫ\Ա׍g>_/ckk݈^+;聴#0ݛvM]&:-QXG)C7`0yå 3jAh:FY^1Tk%i]7vƸG^I7ُ44%Հ7JU5SFՓHcuL`B\wګ-= ʪyՖ{2$fHYMkQoPĺF5zoI⼎.Agnd֠#0NuA{(ධ 2 RnCVxYd\o 2l#M0x'< j"=W'Q^"dyJ$D %1d9%cSAI,9X޶[*E !I%%Tj4vCP qP눰H1WOpۥg},jZIA> Ke'V`46JLA -a~`00Qv!tr@)Vh hxdD #hF($(ֈܙ@D F+4dőp.+[n$ױm.7ܘKQ*-=1>`dKej\r&M!xb>hqta kى}VNU4:%P2eLl ұRg5&9@XkBj$# c$ceE_(ز > zʓz4n8~ü8g_-ꮂ 0!8݈ϰAa~ N\>q]s,~Jdr4.Bht%[fw8#3& JqAD Y(#wڐb&6 ʮjSZ蒚k0G}fF0J]hNsU!1*LQo9Q@~ŵI&Fg6Rܐ]. 
da$d`'(.%Ѥ,w* `NK?JafLd W*̠MJ>%̢.%H\dF*(9r;I6(teX=jnc[0oj|ط‡pMf˷)fyk,pz ?aΉ+pK7&kj̓/@2W;Ƞ!F_F7i9dz>WF64Yp-| +#4:>n8tċ>}ϝQ9=>'\>rgO-HaJ3BJyLEi03NL\PdVg˧<$@# D 0j֬EnM_arK(K^O*K,%yCyWSZyw,sAryqd)Z%22)I1ڳh'gpLVB3"+u(5%zi١(IQs-= |&%U[3V#g,^r]X3ՅPYAURͿ$u7gaQ4~4\cGKSOAtVd LΝUr2+}\C +Ou @P?!ڔ Tde.V!eY'b\^֮ae8Jx1E6htecVw񾆏ꅳ]R 2.crƴ2e6\F"AˆBIs# &Fz!5dl i vLIj ;PTռß+JYDsg(^(z aT$K~Q Hg4ZUrb>F뎗8Pmv+t.r[bENh3r\699 wvJ5\t%>k{X§gYOT%Op5AsY.û+JQE!H܊רsvxQ꜒yh9R"#XḋH ()g Wb"{>.X8t\Piiuh-?elӞ&?- zR0փ^ vr&Ny6Yx$9`K@ cd&qcB!φ ic.G]2OӖu:d=5(S?]Ck~W`ߜ˾F6m2}|F'_(ߧ}Z}.j8<.K\Y2LI.Z4DYLta+gFΓm_~̣_B뢥 eD;&:yknRmw,X97nXhRl7-F-`wo@2x) nt[/OtI^*e~!ny|>Z ӛN/w'$ux8u'̈tzݏ*ʂӄ(065N ѤI7xo*dd[kTR@P1!j4پYJe ݬiP™6rSt֔-ԆGhc?Gl|ַ>L׏?T~hlC(;RMc pL)5[9r` 6U鹝MՇQ3|Rb+[S!V^\,D5ELwfofR@p|:e )KNz9;8 Bg.DKLfU+8:v>{?tZ-G=|%Hs`ڧ3WgW^Kw\^% ̊.fE=%#) 7;͔cF8,ѨhLlhшힶST,4\B.[=H+9  z-ru{fBd/oL[XzݫƶS]DiK q7YiL/n,zZ܆ |tq]%ĸZblӊNȠgafąwG&a}qhֻ>ro#??"xdte9,^=4 b2dљl#go杺Pie.D@':}&bus@0(j68Ox+9N*F?\^v_R.U:Gι·&V1K1yeJxdi`J6jn_ϛ]80dåtrދ.lx]ue7,k2z?S6f1M{ 7n|D%b)eT²|ttN M-<iٞ'qw=- 0OhSj E)2˼ȜC]lEr+X`F:8O8i!r#,V FG+ dt2UHЊKPrIcWoqI/on+ڂ[:SS J%.:pL@e699xgތBK13A*K>%2X`譁Dޢ*阄r!FxE,+3%$$I͵+ҏ^e % :ZΌI  LK+&1UIhvH M RnC4zcɢ!1msPlK SdVXN賊xtj]Uq.K'1 2HJNYZ_Bc$HkMHD2^pĘddlBq_*زk<[V8w܁:Z꼽/⋲XUpF<Q6Q4,O2YC+WL;bK^aW4|̻6 G px*-11>2c2xb*Uh.; 3m}_\|ki; r/7 <1Uc]E|~PY17dkV $/FBj^|M[)7Φ `NJi7TlAT&A3[|JZeaE]JK#:ȌTQ"s>wGmP9{::َǶ-U~[Gn͖omKX=&ܑ%&;w.S/v<9I;.΄ "{IC u̓/@hw\W;Ƞ!ןL)ntŤ򐶮tk:myp+#a:̷;Il{d?wtѧofe~vG-և5Ѫ<(ղIA2CG<762xCEqXd=RI\½7?i볥W<+Mo,P VU[G8ќOUWoϿQ@ Q֠B"qYYH\)ɠ58n8hbI( Ie+SL^ ɍXUDc O8[DӗiPjTkYPۏ*Vޝzl\G@/YV )LkڈY43HL8&+'jkrYi1K.H$"*C=Q3xZzLJfFI(,g ..vI 26غ9J hp \ # I'͍L62 0@ r|3H iSҮZŽ C˯ʂ(bV3VJn;Zɇxᅩil w)|m5ZҠ @3>r%It$,6(h«x5.\ jO>8PhJ$֗r a+/E9 0qqϳDH>t.. 
dS]Yo#ɑ+|3>xس0mκ3ΕZmMOȗt{B #!u*u?s DH,lĄJ̽26fmGxkdn:U CNףC+WPg~6FѾJ:`@g.u|a*uQPRU>JP>JQ~R~D3k:!WLbc`BK &U(4'í @ʒeFs5FoQQJ:=c1KAS`FJMhbEC-u"m6vc6c?`Y*y,b9BpZ[=E_<-O`˃) {G&>wt_{v6|utS-bzI w652v|k!?G(wp)Bc&Uӵ}sWL#,8ZI}o5=9 aootX~)ǶM{/c` oOH`\2ѭu#hg:T`DnIP_+%n|vFp( ߶kc嚩k R{&7kfA J-@Bt.lw11ػ`4+whl_4v1C>1f}v©Ws&_KP RS0>`L.h07 Ya} (aA>l:artх+p9]sgB;x%H'288qX, GEVEBFA9\K<39i|'}R%ŭӊ;?f['N|a'>Y۱IwvTs1KN~ZH,J%11r&Ht:'JӘG@1ZleF͝QFGRvV2-Kpk8x4ĆزŵcKQMQv&GVd)-r&IVF[Fa΀f V0nL΋10lG·T]JheE GuUy?U]a+~ͮ#E! K75^)VK6\"Yr rlMTָ/ɯ1^_ä韹)$pwSxY-p"iί3CK4jpMD?c?Q~묙U v5)y@7wRP1e ]f _E-6`M@ZD:$UL.sC9PN1m62>fR~ ϝSjD4m9';XIS.@tzu fycF8D8?/`T3RLWݙO.Frcz^e`Jsbͷ0w铥zdDh㤊70ChJ9q%7tߴ i^eYYJ?LO1yŚd͉HUg7ٴjӽr4ZC00hujKR]|;)ȿz1`|ˤx?ӏ_~?gLO/?;7 $iѤ77 x_?Lm- ͖/+PPC~RuY$$, $5? Pl?WO@4n]lEpDq/aݿ^(0w+o ftP@Eᾚ'c:ۿv5>Z6VH2ΨՀS{vXY~\2"7YM.HQjCtjoL|DڅWx$FR0#X&&&!6dU1ވe!"i%Y>iP*; #۬/- -F3Bx+-s4Ld<8Y MWJ"Љe'pt-fhyun1C%.=c/TP(տؐ:%cɀMPW,omdT -VؕrRmvC 'Xna٭^H: $WPqs\1 -ʒI )"z$VL.’\hzXIgP;gL8ƕv2p)mL/6l\*&"~5nP FwwiX+XًӗꞜƓ 23"WX'?KOrjZòWP{9@͵krgз]J7?Lf'<|W\gz[%֮COu{+="~˲dH9Z!ao. Ա>pW khD Fl`hry w 3 sͲ W3 V ›X&*^^ݘXzݫm'm:hKm6na`znۆK2]z9\(ʑ&Y][sgMIn@Ȯb}qII%$$cGxiJ#PoG[(ѻgarǜcw3-T<2In/l8]Q{jc2zQNDi{- Ͷzjy]Y'"@'}}-'bM#7}$fʕ28Ox[rryٵ1!1c%DϘG6˂0ִA&x'tK9ojk- Et@d{>КR^^̠dq$4b7tnBeK F.^ሃ/41+@B8 hD' %ȴJo ! 
RH.=Z98حi1 ۯ¦'2߽y}^Ĥ;]1Qp?G-km#ES6sx IOwb҃Fd-)td$Y.YRRy"&|^}̂K&]2!m{PWb%*YΕL28ll0992(ss-ټ}<' eqdWWWb٠S ,Y(Z,Y>DSh#6UA*P:WɫU6BCN(iY #ղvoZcc'|lFY^~Lp/V^|yQ7.EK7{wԛY3tՏ3E{TbZx 4&8agZpgnK}[xҺzcɻĬ6qrA\oiޮCYCi0ӹz_.0zϵcݸ|7o^͇^>\ߜ/_h,: Ȧ^*X:ұ(?So{i3N 3F5*Z66-KZ!f7o~O.̇7û/4P8k gg3xo~ݺkV߼k0xVޯ|?oơDJkOHqܝ!!^AWrZ 0kiB,o]̖!6_DC߯X#GxGIuM6'I+ZhUxZR /ƕ$%$L#6pmFtǐblLNIzikÜTkGcEΆs ZSb<*f*EbL)-8(1e)^>vNV:Gښ)Sn3Sgޓv:%tvp>V>?S!k6F7r/m&rz{=XK%U]bX5kzK%"BAłtWIԊBfeC=LJ0Q^펴|sI: Y#F`@̰b&֖.)d4ҧ};Yё-B2ra$tz(t;v&:IɌDQ"HR@a eL%*z?HBbNj7$$:/LOaHXwي]kl!%y_usu )C"XV*C+BăN@XPRP*'1s#c4-Rp׽׾[f+>"m-7uYIS5"`{\Ч_# Ju`(ͤǕsȆ<QKJ:+HX^؈9^ko9;BcfdX.(l& `Kɤj<]# i\dZg/ \HRvڌ\e Q^ӏJ3mGFf>:&h.8ƴ2o&UjI|_pv iV!ij4H@dryӂtX_\Trd,4`[sQea,=ƲNY8g%7֔rL1d֠6* _j[ Aԙ⹔V%Yf6葃yo9b"3b9D ]B,FΖ)om]n>~^moVWošx{$]ׁ uE"7=mS:5kIT;CWMV?A?!YD+zy>yxCg6]Y7HOVd5[7nz|gv-Z^zrz(0ݍy5?wwW74ۺ.ov^TV/?&-'f(b;ԠK[H7׊3g|CsUm[:}O*;ʾ)j0GJO%`iDޒL烧 UD׷K0b݀?99\YAvG4Y{y;x'_?9&D)W/A*#gṮNJQ Vh۪0!VFhSdSLNz7Z%^F53myQ8,G9@MZ[';?59&kh{UOʫ*WޜzL"b,E# A0yF:&AIcDdc^DGNǹDI[#gٵRAxak+cW^-Bx^yናjX,'izsKvfO+Q .89VňJ*u6Ȅ`.T- |r%U@Hi`=q6A+$`#QYM$+2 ڴ99ry,\q,\;Im,vɑWe8Jx1D&(}cwv cI==f gLi$E(IS&+!jl$}0TR\C2!$PChc5onsæh1V$?'.f V:XTxB% L"W18ZC+ |08RϬ"&~n!FI%2 jU0F8@'P˯g ֚tQzZHl幷:{JXi8J; 0XӺHUs%+;(g̋ ֭vukF+2+J!Hգ\yŴ>gCA唔9ZXhUR3#YQ .,QkR.3ϵS CBۈuYR#8*|fB!?yy-K2҂9C#X~ p-qIY6oδ !/ ĵ ]\2syټ3kػFS#k4H[~d1\bW,ȷ~s f@'e 0nN&" T" ="&"Ko7snjϐO<{Ƒc add"CFȒVg 俧NL]6[&[fU] !WLbc`BK2&U4'$ Fw?sϾGD=F)LJx ǸN.Mv^)5zެJK9ur6#8DRȰ, Ƹ#aREbmP RHDc*mclViNEhrt,g[Q۞޺pn?@뙯oQh5uQ#Abٸ9ڟk:隷71(g #B&>zLN;?s6wN HOzYopĂFָ 5tӷvcBE_7ɧ X`T9F$㑅H1! RSFDD b FрG!eLDKOЧ׵.9hsdX54N}J+g/ʡRG+ϯd9YMK7gI7z%5k "NM[6c2H`iճ#OS0@wjRty ,2WPfK-a04BkĄőbfji./2ޖ{5v+N,ik7dCܶ3.t5p#S=[2'*bQC%1$#lF v4?B/5>}̻@Tћga|Üc_+[3jh:XAը轡t2#jhv=&ZbgiVTc &VWO*կ3YVk@y3RTzG N<|l Q\D1c,!xG1E㰨0[SŦ?κb_^' ӓ'@8|7p>mNЦO|̒M>cjWOXtbatgVbM9.XƸ1g΋15m?7o:缙=;` @$^ұ]q ޏd.U;ŀb@4rAŢkBh M*.X*MMu a TqM"Oc*5(K)µzqyְŠ1pLeU׽9! 
Au":qNnOeM( u><ZZKq>NMBM1Z+"X,D$T&ֈIPD!EQT f́H:\\)IxIIYy*#R"%1  3DiƝx!H&6]@T;C)9S6pB2-% 2#KdI VpR#-Ͱ @`dٚxÉm$h[Y9cLW͓4L($-}64S:I5A(w_/&TNVN헷;;|tFwocݿOw>~DO v`\jZW +xozi@&KSŖ~NJ҅ruST ;??}?vsB' 6g<'8TO?~nquqB/#|5my]s8B1 W!ل?&&Z5a;;w$IOgj@:QpɴHf5!ػ#!&5,pQ*IǶ6LN ?ѾvaĨ0T f^$ A$D\&5'egSY6{&?^9zSrmCprUnypKL5`L' zB,H|]ql Jc1Ke3ƊRdtmxqL, k+,2>ptLagJ."4Y4٨5*1F: lp6HFZ{>yEKR ٢ҁF#.f@O|45&{$qv=kc+H:\ u0D>腉pTFy)+c3-4q/)Y *1tfY$XLjARk${nK;G2">A [ٍ>%t 0AUp;oB!c&Զ O4#,RFXp4K)zl0UViH{!tfC{8mX!;cx7 >Ng۰]6Խ|X|_bn.\qc% U/#TfOl^JI_5`(=g?+QeWmۼoEzJIrIy 20"\ť'h; m;;&X("G|j/N0ξ{MHB[QF)H",1;˜l7."[6\u;5hf ^otvJ]pdbxŐpy^{*J%{jb?#߆/,sX{C5:S i"q{׊[Mμ4>nЯ 9,Gx%H'288qX, GEV+Hm#s"dV֏lG^W4j]۾ly1:cbYCgiPݦ {aʤk/."'^qLxT4pB# 5"pE"w&1gڷ uH :q|%V )d4&jci 9&{U[h%7iddK2'46nԃŀ 3rmt5O|Wt8VyBT6OE"\ 6 #)f{eQ18ȹMeJtdXDc@-J))$EaS@clOsAL0s8^/+%eMŇDV3V9#u*=/P.>wѮrgU%:-^>;pzhu~/`\hCa`hO]QSo7*f&y_h<~6 Vf1.HfxHʊA[:6SdA*F1JbS S1f1RJ2(:o;cƌL"# cBZ"|gKCby64x\t̨1Ma >os ճ.biP|Fe1g+6a o?`8.s5kAK($pi/*0࿋IWM`tOra;Ic64YL#3Uغ̳'$-fRB| g`sY~uU,zobǟ zIq٘c*F&4lB6$C[26$#^H2 $KI&hLBB$0%?̔4.k8XƆlTyfpTM{ds6R.Y5<3',BvR[";N;_s0䍢Qa[נ/WH[_]\0s,*UJڦȫ=쾳YH(򙖜eN)bxЄ*siBޅJXAj{]:AmM S6_ʏWw'S%hM̴ָUM$3Xj,Z'䁜rjv3v4',]߮Xhѝkyjhyࢽ=9]yqn{Qs{RrLΩ#Z)ωcJZjS%DDž"#&AxG97@Ʃ(Vw x/ ]QQq!,Ѝ$IQJk7hSl&p?*>r89Uo O꫾~^W9f=.䢢=7:(/{Ƒ"_ؑQm rN69 ,8l`"dG9~Ç(h)D kzU]ypqd)%22 ZZڱ3HLX&+Ws{V<ɒ R=4 c^DG S\KǹII֌٢zFr]X3vՅPYNU>-I2.h|7ȫ'rAA|k숑OdX#hl LFr2cJun⚑h`{EmJA@FQV2c+2 hj'"iGø`\֮ǢX=d{lg9 d d1%ڑ喤UHC)dme}F"(d!`@Vt(YĂI 8tUdꄮvj<[T@ThjFT54bk(+K,4Iaˈ }0te2!+Cگl;iW}'iA(*fV݉l'RcXOQLrnO&5ZJj Q {j QRc^ajL)N)ɨB槢 x|Jc:u0ڄYɁ~bN+ن|m-8QR>(d:}vR("BXR+tQ#3w9S^2Tawwht5|VIQh!f 7Nk}^_8:)}/)x]rMʣͧ?]##e2qMKOɪՅe^Á^'C5lgAwy.zФ+f":0Q꜒69h#dC)c##,wPHThs\AA쀻Pp5ryN.{GԉAYhiM]feYѧaa+ Jw]|ϱzΣ60˓$͙P`C1d8eB"W8;#S? 
dBL&,r'uʅ?@|Df'L&꧴}!р&u+F+-ƋФ*!nΊyEyWDh;Tf226:TB Qd( Lf)]r'>4t@C - eMqO4[#o!@ࣃcjAL2:ύ M75%wؘ(bƧ팏9r+-$l~q "NL{O`CQl]{w=:XDuE\\G39&p/ M2,9T\X-Jp8V8i}tt}}ً9i8ns{s4S;iW}~T2)x|(0 NnRnrSN@sOy˟Ƅ FV߭-[)*]d'/!-$sdv sɲ$t\X:xaw u2.LkXzիƲSҧ%6l YݞJm Wdzz[ЉiN[rwnW=FHTG$64(Ⱥabą{&a|,nq?4Օ{Ig pu4k=435+XfuE蚸NXR6v/mO}v/杺Pia.D@'}-f6L7/ ޅJ]IWX K.h[@ C<>_:)a#>_V=,*Ԕu1^BCR B1taW+t%`KYL%HBe3.qƬsTBcME/J ɍaDicf.Cl4upvl^uU:r(Nj4+<$,2 6aX HEU  ,Z]L*8ll099( ZE\:9OKJht&ޟ;Y*xm`ڦ[oWt9$1B3+J6?o٥dip JCʹ0'oB9ZoGo9f+kGheE[uz0'Xcc3%Ӳ,m̊eQK޳,KHF3Gc%>i2ΑpjB(if5^Jt$ oٓ%s,Be\- 9Q~]p/Jdb[ :uzz(xe˩?5a[Z|deF35BCr1R4qlZS,7AeYXZ# :'d t 1A%43Ǵ&Y$̣@ *X#IDr$IKU$A?w{}< EkQ6tLXu1jҐY2:LBP<"$NYc$7w-M/4m#HYgXٟ0`l$^dc*4!#N N:[:ƄI;a8GHFtK텍Nq&ZɣbufPj FI]X[W.?nz{J@x ߐ2?k*$oJ$yMן.0һzSVjxbgL23iptwgRgi`owUsK%f5Id`hjr60'7ѼC4[=qY;#]FF,;٘V? Y,h8M~ef(QgnuϪ:.i&!e`Eu((8g7kG7ՠ6TOcrprWĎ"eC_zsa޿Ϸ8Id!'!Q""3x4ozhbZCx1|>9|jJz0~|7K8lwBBbg_!ʉ/'xq/LonB,.lPL/b͸]p{elU\GT& PvO]7Gė6;կH(AaoGl!x+!,+]3NvFK{fڥicy)+r6[A3+Ui.clH!lXJ֛2CGƨݦNM*P'v*@5K8Sޮ[;G4Q^Ѵ%i8=J&Sb4tAYmhpkM@@nJkmWh\J.ĘMX{)j4'!m }^{bRl`xicYi'it:x,VI{Wp`r]+a>oZjF _G֢;M.ɴד)#ܯ\-⺟\;tt tF7tDx4H3pP3ɮnh(| (ID`&qJmB @xLCI_Uqnjuu<)rI&Bt 6Lp' OD23We4`JZkPؕSR1}դCk tW-HvHc>yV֐Y7K-"-V)O"}Q3RbFˆ\7.zZZPY1#OǖڷNpiy)mѹ RJfXAjrb^Iy &| ׭8 % uQ;C,f4)h:=3d(xyګ73 HfGx%dCb}G.3dE*a2:8Eh \%Hz韏!H')G$)VrI?:(h8#Yi-t3Y^زk e!1Mvgyүd:|ٹlE.y*$;Uv2s@(C+kYWB&} E5!5Iֵ9v2vD2v&GnY{u[pA;0m|qq|7^|Q0 .A;/',nwmއ cMg7fTˆ1 U dՋ`}}Upd֜Lo"3{*} B{oB+c~>jon.*Xe cFʘPZ4NiJhYewz}LY8gc֔\e4: _ʠ-e=XϕdfT5X1%VY?{WF vIyD^ut ,=`0D",ʶzFQ/Iʀm̊ʊ2"2MP )K!))P5wR9& ?oO4kʸ,msqs:m6A=}kg}x} g[zLflN5SOvͧ܌9\%\ۘȕ96!Սvu}d_Iܣ+<]G_t6fMU[̈́8‘Մ[7v>~'3ύ\WW?&VT'&K{ޛoc;/Ò^ 0) 7]*oKK=3ZR!*@ν/;QbTP$HςBS(8RDeJ^D尉^-4;Ɏf1>٪g,{B{ =2Po x|NbЂqx k`;Ǖ(2=H8!p**#Dr '"Zc")U1H*TMR9®R8c[,X[,+>8zf?b7w\_{z#vP!7b89FRHAU@˭CQX9ES m3{4JX6 [ "DM.$ 4zDDԕxKGa"g=bA?ǂŸXPZn5ؕ% m// ;@MVADɑk]U|i aR!ZP@ & hE XKm}u!@<QY[>!f`<D,"(EWb(Q dA)mJ*-VeѾ44EDrDT<ޖ,eqy QFI{(nKFbGGYYKEUe-.5dm'}t(|'Ƿ*ʫ ;Px#RaJs2H.)qcF*A6Sf f4_е};7aBhx+|rqdp*T^kfE4 hf>n r {6٬q#>5*km69ʭd*F(R5 {٭Gûx[ Ӱ$6ARZ 
G4!VkSJ8.~tUXUaGQʼn@HŨF[X FgIpK.YIqdsՎIF&-- $O#KA1.S-cF7Sjy_^] b K0L)Bmgs3c畘V_- U}(QyԁO,G=xpڦϣqXgXbW)U&X򓁫L&WH-0pTWW,'W`O\CũUVvlUO 2ʜ \er:BjcL%7-\FB{sr!?v'\5.EӍ[O|qb!B ;MO矿3Cf_;bC%\) ~n`jji@m>`m>`ǩ*&\s}cLYU*DT.XIe޸Vi}}Oiݥ nwQ90mCO!WR_})RKhPZб U1 9AMYAtSv1;8x.vE"Z+o2xE6^ERDÙA89I˅@s 빱12& O5Zks SRJYo"BWup W Ur_P컞V*qGW![ߗz|,b/Q^ŜXGTZMK*/$T Pt )Š:SǬmIU1HP1wd'=sdk9MQ$ LCRA Iy Is-W*mw4L\hZ91RHp;YmC50 v)\ &_Z?4xkyx~lM[[c4 Ĥk PC@R T BBb$ h$x9^*߻Qzh ՛9hEɃBP敦hAzB!J4'QvQ\yl*ݾʃ 9n! 8(ZƐPzlGŭ0N sg :n _}5ִ iTϼ.YK"Q{TC T[ eM<gs#>)=5 d2Il\:f it:8 ajb}6vR9?gF׋qr' 8Bgߪϗ:UP2 .yE_?tl?;VտwC3‰͑8w텱ydJfWOJj4k0SvvC.rf:^Y=B>=SMMG.Z@J8j=!HDD鸘\>9o wn]/?e&lC }mt]60Ҍh^oo<1BB dp:gic['ӽ5n(>v U̻߻.t}uwedd]˞U6k2TBCcg:4 'ɤĊI%?Ǔu߽Bv ѧO>O>Q?>w = $c'ށ'oԭ~[s o[Oz;68ۻ~[fz7kNB2nP0 Rl=?bقZ]L`R/b}{ס>惘z#˵*'qh>WGtߖHR jdS@XELQ 'a8 >ĘiI ?4FK{R51׎a)iJ W@UI"!mHLN3sQQ/C":GUAOlUAeG0o>yDyiңL 6~0K3{X0P Ă_ q9J"*VM@mskoDT6z-Ypؒ1* IGlI4[52 i9nVzSEH\ 1PKwQß3&HT4QZT Z<$a%%Hsɦ<[TͻS7ퟅKGr3t^9a#M%iGMԃ^*Gl?iT᳼,%QU%TN8S%&ՒvђԿdɼ b.l>)4ڵQ"=>XD`sd()Q P)9Ȟ6Y,A`d)XJ%Mt,˶#̠mYXU5Zxp6MG6.:+%~t.94(E7i4Qrf~3!`4s>5lZcy4>`)fI;Eϼ֯`vl"Cm6IPܴH#K^ dͣs.lUvxJx{w&QOd^M1i7K[:͚B>鸰5BbnJ0cD݋woJ{x.A;-6l7 յ?no'VGm/iIB TpC\Ҧ|y|~tx-khn iPB1Vь]()wV`KCϼv%M)hB[J0%&SJ~0mT`ƹlj9.4wo$_Kjfw =@ q_g76{/o륑VȱhI:^gI^rKc\"l9;Z&$,/9JT+kwI3ٓMZȠ#ʁ\Vz|<gySWpobXn+zLdJ JOOk.&=}!WKIO_U azz,¾ůf2ըcF o2h(z.28gedٚeCf jAےDG빔9{Q1SUM)`۔G.| {r#wtXA{=jr5r4t--Fy a#7)5^.MOWmCuGpyos:K_B>>;nO}§S>ۛȭMs6wmۻ.~6ӏD #MuC/Et;:]ّOuCjP˦ݭwmzxw筲G+-b61I/!ǃĽ莎[S˓D]$!˭y?&#PizhƛYM5LԸ@eqEsMg2%i]2ֳn|Ō+Cݾ|UY KHEd"]2z/VE 6=e"Ȩ1fYɗ,8$"HQ3:J8%L UK)gj;X,X(z,<)>}-$-i_{/4vv8qĎ:#QcU\g #fT-JHJ:S݊Fdi(6AWH`#QV0c3M$+2 Ԯx^݈ǣ\v펮6K[Qt <1dH*k Ս\J$-*G gZxDIC&#iA$MX P"Rӈ9,RRx8WK0]AjXDQ##sܚވ>;!M"="M+ͽrRƪw\ǪhHqHGQx^}0TR\dB MP 92"V#g7"~IԞpqz%[g]r,.ʸz\qqێm *;e!t+%cF 6Ř#ID*$Feq<;G&?Uk {׻C2ZPi;~}yT],\ўtEYuޒt^Z\lw$-]XϹf=fK\7FRvF@I# +y91/9O$>er E,֣!rcJ0MghKx=Zz7+ۥrBmY>I:',9h%Ir0RT[GusJ8 &:IdFQTܕ$ 0%8I&8(ˀ iE뒒4r'B)](7)2wf ),#[YkRy4_7F39 1yLIe%gYWXP.4y@pc$r%Um;~޳FikY,|17ȑKvûY1gOG _oG 0j0X@cC2$݇F]%q 
?2񕍶~ʮŜD$EK7.휧k>G\dZg/A\׷ZJY@4DUƬ%{n1!V"V۷9ũ(ԥ]8i㐶μ#Ж 9Ss ??Su`/I`%嘶R0dc+RrLj-v=4QɹsLvϥFڳA[ W+y0j>\FU݂+{\]z.pApťerઐ/ y*T ;+P\\a*y)pUpETbה^"A^1 9xl2mr37? ~]t-_7u0M#--NppIO\|d Ǐ[s)4|pD* z~0L}ҾΆs\feL(ReK~ŏ4Q{9G~Gi}:MWm"Vys@iY]xVŹ,h< j9؛>75f? ӺIP!bf6gbLw|84]`#`yxwG9^ݵlFy"s K&q f/m2^'U6+W*`D P!oYXsZcDU D]M\di/R4(Xlcd|7d'' .F)jv)Mv^!*׽F5i *k{1pUȵc!jUR^Ϲ,=,k$8}<˧vsV7^h)_lrDe kBR(4ce:@^$"@e#|ʘ^2$3rmRjϝG&)L)+tc"9O)5sg 6&VT"wsIRpYWpt@쌶))ˇ/sGutµH .^^REyP}IX.#dḏ־|G)K0eLN4^kgU2GOҼJ`/z-#]"|x.nH$i>gC))Ùs5غ9,Tu,XЀ6j-@jAI.=bIv`n5rv&NJB%1&}[J:&^%%8/ ڟ5S|GeaMδ E(0x%N\.=wYs(>d8{ümD[]m(3dG ^+;"1b17%mn(W)nienM*h1Jܑb<4Mj,h i7RFG^9EE˻CȄE'̡~䌝$[W 23u2*dDTA'uJ&+ a2 ᲈ0K#R%8^Z,$MbR|mko͏ge%o>i[/DG/bIĔZp[骲ZyJ-XTJ\m+BgBT]m-T=޶& buk>M/K %NdK.%Zx˙|n_Gs$j ;=$?]mn弘Ļ49ggoCMM|iqϠ_SM3=\I>ΒT^DPQfLYpR$/j+T`8J- 8h9nȓZcqyqU˓$[K`[^<"ݴ/M/%7_{=Vfz|s2=mqCR,}$m4ԁzŽ8Yiwj6qӍ~MZ[f OLR3㇣j&m{{/]-vtFm~X|5mb.x~nya]!ױj“j Pތ5oNǼ 'l/OY1둁SH#\rpy H&)\I$8xO S2Q9l\_%IOw/ {42v7ҏAI-6fc_H5Lqg`Ŧcyc`{,Me=weVj ?EȋS :A\M*"']O? eG#;#:Q^#twGQ餸@ !'CxkC0Q" sb"KF0u.ybVVgD9BB6HtX|IZWUg! ɶHdQ;B&HQ[km#G_E?m>/a.3`0 >m]d#ISn\g}~{n~lXE+kϷ7=70yop]4KN9%$D1)(ڂ"P`Ԩh@}.1Coe[B|B7\Lv:Yޠ*nqR(Y`Twߺ鹶]FxuZ# *Xʚy4!fgs*IQRمS&CTfii+$r VP&)j3'bGC'=b~]#s4y;^3'#=]|kr֨+c²2,f8ݜ~ޥ7h姆w |)rB8MI&C{?5#jTi0MvrMN.rb'x')80wUQ Fb$HORÂpUvxqy(,=ȆfE|mb~frtzu}|4@C/_Lf9c/6>q|w0)^x61⯵w7م7Kb83M+?Le3܍쒑7 B|xo18| i^feO Vt!v*pɆ8 G0bG|'w}M/'##{^׽0l$;|_GL$F)=Qy_4*AlhT"Oi?ln|O(xwߝޞûs~sw G`=n 'kHez >@]O;wTko5/IkV~ O7cn޿0w{j @^)GjR'=?kb݀Zr.flu"D 0j숙,ztP;P&\WcPި MS/q%I3S@T;?J$gǐJ‹NIznk\TUp%JS%)!3Ґ?;((ɇΩSĩӪxbȋ(I`*RJpJL09uԿeɼ b.KHmDct\`RdS Rpc)CCȺoѧO?8 /wZ{ZotCbgHu< ֐W>`%Y*[ lؖ;| B^ qQY. 
8>J H9wn26*)x:xU֋x%:(Wb ƭC]K=ڗm4FGZW_*%G<>l +USE" nM 4yց+mW$h) ZDJ5*zTL$0l>&J)"l:U ;29Sorhbb6Eٶ;U#t߁Ceojo$ - GPlB YKR 9ttZ:&FŃVD<^Q".{Vt{Cs%2J1hښPK3J1&uycOl\+x% x4u^:|PykCWmRF zEYKԌȅf9p󼁘YT P5833;V ꟯_5#_WɻCK?ᜏoF'.܌UR;74?Jj!q뢦L^z.zuVoYدV^5*۫*PBF.Z6U+t}]nO3ȋa>캸ֲa&ssdk?v;Onw7gA?Q}dS%{=|Eӓrcvt3^c?g!zqP<]sP6Q*2G^Ě i@ZTG&jZMT-3Ȣ@kβmhO)'8S9y:6'{I2%( 2\@I 1jM8h-l {K]m 97~ qO3gugNQTF-tR( 9Ii2~ AҢb8UN@hE DiH[+*%Ծ+V ԇi" GbS sz%:%b iM`TYgDIˡ5&SRQnHv儙kBH9XJDj$=iu.$5֢56M"{}ދo@[ A8 CP8|*4;'ʼnCqYX%rֱ$*8Aͻ3rmެ|W`$cMC !j!Ge Dl.eY[4 JIB|$2AaO9"VPpjS3}/>++e!yHHH)7Vo9(tt#7}W0'6< rVYX'1š㡃9h?B[bboHhPZFrnxer)T8*DB@]mR$v]9C"eAJx &g GT9;5:հ_ֆp)=VߡEp&i2[wuζ-Mvݛ]O6RC:3IX[ks>e>]~j<p3[*KѬ'+|qn/Z{x{㳰Z^zr3?n6\߽.W+,]"~8d&[g1.W.rQvr8m O c8PRH"$R@Db;xyX;"tV0E0MAp]㑢eD,,$xA* iwbdp*T^kfE4 ! /aa善,"] Ȯ+MX4xaEl:\yjB?oO|q$6lVP+'V8DltD oѢ#$GHDTlT܊Deh6:+M2@γU 6d TFEIF&yP0䩴M9m3ܦs"{Lj}ۖoo>rqszn9vs._ٕ6f}e#UCTU.Ջ*grZrJ]U%M=ELcޝǢXtcy,:7R8 1QXtcy,:E<ǢXtU İMʫ½D E[\%)a ?E#b+O2?{WH&!m*bzkz7bez6e**֖,y,ܿ~A]>S%Y2) Ԕ;+Ԕ[d;(B)lq J)1P'JA@Ү@ "nKov1%sI&%j)\ZYoVΒ"AZSԪ)9˚֓}oɂ?,?l}4H+|^ _o V29 Pf[ 5LD^_Pi mPJRRi/m nwރqxY틖llzeZE";GmoɾuNMMNZ~ȓ88މ+ނl!7hjغZzb>hHٗG{/6}+C8; u†hPH* )$Vb{( {Pyy4\ޞbBJ r1NZoLPX%IxM2)UW^(ܪ:k;?ӲwJOdz͢:m;n[-f1>YCG2!918/CeoXyJ ko#$D:!ExQ_ގ_xnbz'zs=@[ %81`"7QRI#2y+ c889}M5;ruA.3}JV*" JJƀɅL,:( !pÜ!9 DU崙šԈ7[#Zu\#aƒUo=Ct=υZ ".O$تqy"ZԢKQɫAqa ݺ˼m_y28Nü2s//5YjcJ4͝vjKfFҔDM Ѩ3qK+cz.>[BٱAQۘzWO>'F\yIe6>N`PHg5.K+K+ǽ/qQk+H~c?zsPj,v-V2ݏe*% ^+5`"ܨф mEcJ[}J>N :\`UNTVAOY/Ny ٨r/}EwP Z{`~ЉmНcjjzan|||.IS$_ Ytr/YEu1ٰw(1BQJH6m$ , QA$WFgIa%3r6K8x<--Q87ެ|ל%k)f,dYzD5Wem6G/.MC5.&ZgcƜP^ll;+K~pV3ۡsmGb< ~B&LMx/4qx5auͬ0h.+.xeރ4L ^Y"2 dm_㌡qO('b>*Vܷn)%8 KTXFd18nuY%%gz鵖m EGj e C'İ 3)WB̓x^gغ:eOB(ّQW4⼧$ Y$SP*k mN@[kTn\PoyzǩV/'V_wyթ’1yijS 9|:;:Qw--?/Z7C>L/Nn_jX:4^m>=z+^'ΘUL-Z͛W#9sc~ٍg ohhWĜӏ gx3DG׫gfFGb΁֑}|aT0ֵsU~cMy4':uˉ܌98L=kF%֏|ȶQ۞j5nr䥄ԁ-_Ǘv}5GrTƉgŸɺVӘL+H!apAPV,ɀA/u`DßSSng Lo)rPX߅oMq6qQ3w[Br'ȹLwjc;a(ޘۏ1$Z`MfIStUFDK5lq/<wh)kXrg!2Qʰ67&iAy/@ԟ^EZZ!GॷlmmT(H^##K% ,/*:+r6C^x8zޠ}vT2ѡd*(/3(T`u]CKI jCHdY?-q U(niB)+BzT˚- 
+v)(aO/=1Fa;WC}2\NgSBhJvj1)k3KSs>hBw't?>E B `N`Xw=Wۚf %M}HOfd`I%eeJ]ܲǽi,{ ˞RI,R Akrcl%.͹Bj>,Җ}e?cYx7cT0 HsyTLjePp5,[,+YrPU#G|y~zvkׂIbnj$ŷbF.Rpn6~u}8jh8yQC.E?}Z6\n z(+߾*ל"zމHjy`)b25s'$d׵B(H aklZs׵*qbP.ԅP\U92~yHWqou{6;M>99;gfmXf'4FTrnhdᚵj6BUeMDnf/J)`I-AA AG%ԡF탪ťw .v.o /Xt k끵 gKUUCM\ VEIaY+AYKLs]*chH zV8d&V:k:຋u҆oTa˨/v>4󕳳'1 #.eD0##4V*fzw0ZY|lf#6 %dX-߱hmQFLGL፻WJ3ucLiF6V8i&X+uc%s;#sPKֹX<ma^ ^<"޵6Å}TV\R Y  ?߆zzkAwa,/ .‡<3+P؂ۻWMpݏO1pU!v%7&WSpcA*ƶ7&JCgXpCD?֯7wlyi>߆f{?ߠKjLz5`{DW$GON/.mM䣾a^>+!n%gHJBץZ4PL&~>4Ɉ ֣*'\qk6оDeܞ#]y88qX ]MߴEW}ɉ2ճ+ v,''o,_GG'#?]9zh,IA=A?ß@\w4VwCg"`\Mj;='i~p~(߃@WzIխ&VCWIBW NW単=+،WDWQಮ&ZgNWs+1 r Qz+Q|ܹNϊXZհ4ЊsX9%-緭vLm o tSKG=:}wDOO揶l/OG5&۰v_==8/sl%xTxTXۅ5ro-zƨB5%bwfɍǣ_߽w?LJiG?)A- ڈHhNarG)*=jd.1QnBmi8)$9 \86ʾ;2ɟ)o@X-&BWa.=]Mt *DYpkm; :"qa$D{qtzgȟۃc8j p婷=ځ!]%J^dEt5]gKՄw &JwЕлĜ}cz1~pЦ'"/l݃܁:̺ojja-thjd>3+&)X}X ]MkkxJo*tMJ8$++Np&ڨNW zFtenw'3q|E 3two<+HjHz•h;IObIZC5<`p[ ]xÊݳ~7qV-@}ス׀Q:9ǣ_ONfDW}._ԟ5=1I34u@,8#*} '?&3KCYm|rr jQ⇑\e-a|&,!^.ûa^H>r|-;~aQ/W%7'3S2>`!.+zYx+|Գ8b"ze%i*3Yh2o[H >6~>>GrqyV}y :=mp572W#=bF%e/܇P! ff`1E]T {Y!n'Ze]԰$Z=ubzo%)b0vvhFK6XF{QG*FH#H\## qKaMz73b's9ksi!gs1 Rޅ6Zzg/ B`5H&jЗIU "{hdBi5 ˢyJ}Nx!ReYŊDv%Ht=!8dGBΰZGvivteGِ/ > id5̳eIj nGsPQCq<~4݊#z %YKo*C mv%nA2ehס-#1I x)Yeен 0ԍ,Ԩ=tXu,yCRZ6fakra5hW5 d{lUBȮ]`S%x X7!WHecYI+K̆6#lBĕiI +Q ENyHt@IOe$`Y"deԠ x܁"7CAw^KC 2Pư)0mgcK Ȅ@i(#rAW[Kqƒ s<`&CC\TujM3I!0μFm2\-? `I!\4+Atxom¦;'D]`#)# WP0]g,W #=A/:_Bj58*b3;ˋ EUDI)b<>od0+=DBBh&uDX(qK͐ȲBB Ĩӷ! b7C∽[ hn!tKԳ~yn b64!9 4tU5u:38!a%_|~7`םl~˚x·*ZŬ7/^.06#&f3x:pn SҗM6W%s jm, ڦLu1 '$`zEBE {[ɐ$Ra2Ӛ"Y`833] V c,3X6t_q5rPRg#VH܁m G_ Wg*YȩՏßzBby g''~."ON[,uN f\b=!eDCd%m`1ȋE}L!YM ^eā pa6A~,9^4Kq[ GEj&ڱ!,"|Pt) .:E[L~Q3LjKeE=<~3L2Б85v`mBgQiY*ͅjʐ pyPZGxw؛Gכ/pqQcA` |( B 55֬V}? 
w(<6`ȷLms(AB5n}y x 'Ǵǧo5Lg7mc]ּŢwnb6+NE=CRۢ$t,ڄÙ33` ԥ* d;Eq𐬆f#c`ah HYx2AʷVg,e98!+;  j;:@Qك)c5~҂ 2tJA^ 99CtpBMQjh{0$tw`ށ T ,XP(@zJJ0,7VAW >l&3 ADQuLɉK3п!I^X JX ư`$> qT FlY+{ |H>oKX ,Dђd4'21\+\ODR+_ %Xz l$fJ>Rf4u!A‰TCx ~wͽӴfUٲ9(cfƲ2|xQJp"ڔ#=Zi 6жcII0e#՚Ƌ _X.D4%8n=xC ^cO#܀HeYc*&w@7ϡ=ÅHnc4F^۬gQO[Lm`tD&ˎ.ăuAbp[$L%wt(_W8D4JGVE@a]UBz-NRf]#cMVb -c@X%0?oEMb}D FBH<0x:?͊ZCFR4 gA"㲢SAHgM;?QLhl K12tX\χC?܂h2,+^/d^JZݤ(%盇꥛5?gkw\73:Y 5ۖ@߸ 5ԀTLD9B==̨!\q2tyV'J_% $I $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $^/ THsB$ѓ!ܞ T}MN2Ã{JH( @H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H^H2}Z$dOs9=O8@y?X$@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@. d8uD.>5dH A,?z IWH)L $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $!/z/ $5{?x,5-O]/OOsYw?"„`!ɀK\O\*ڣ"(ՏkgSh2_xpf,dm"]|fxfC88LobB\6+P ʧ*UÊĘSCr{B;4a?$hP??֭%u>o'x.4qtNm&DDR1(*x-tA3 GJOCq;L?ӫU%4Ψ8!xUwo~j|7q W yjM>XjzR{څ_No#vBᖎ]s<QnG*JJVht=JVٷamƆ[V7q/b[Ǯף 6/Ms >C-Vz2w>?; /{o6rrirz['3p]qK%Wr圛@j{/<. lobd\ 墳 1[&{1\Chu{tyIv꼔Fe!&7gh<0OT*=cOS;o?" ۧDp{߼dwCgГ3x<m u}ـ[? (p8 ']5 6ɼ |:.Kߕj:Xg&UV2VA`[ PZ`%}7&и#^ƶ/eKAf#\VF B A'uJ&˙ QC9/gdA`AoCW< @|OUeڢ{soü}hU2uFVU6+hӸEo4rq4&ʸ/>0xB5KC>>Ul3M~bL8|-.דzo{/qyߕZq]dW|uпμ蘬"3SY'|Ns+7e,bV.l$WC\Gqݛ9u݊fޅ|}RJǤG~byݎxWGCPKQO|{MǷ~8[LnZ(n{i+v~V ƒu<>xFJ2mT A:2NlI)h.<)9 %xq<"t=y5%`lfԺڹ51x*Z&1 LVA $F~paٓ^É^t8QͧcZ|n2o[\yhP_3_h}LnTZf]5P9Dr}. 
KY1c3Vz+w(MP@΃a:)p"(YKQciZXDՙIL?{ƑeM~.2@>,&I&ԫc"4!)O<=(1%Qƈlשsoש.ٕ ?AjD#q"kU{z:H2,,{nj 8FKu8QbզX5Zl?vXIғ,I#j4ZIP"Hx"ZM/-zrDu,e4XN6H*tM#>Cɗ U{&dd, 6c]!~]뜼<ܲNKA*DE)D7lajl}5Za-"mP Z0XAI'=Hàe+B ֚E+Bb}4CSf0<#J򤱧BSzjt2!׆&I@ b~"0XoCc6ՔWެӥ!DL0PrꘅjI:f"D$aBO(jI-U\ +(O<!G vmz|wձ*ɦcgtPkI9Au{ϧ nˆjtiFߕtQl4ŭS{7NzqYSY#}G*1 qnq9/dnoyv(8H|v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; ѭŘ@bDN Un4NHFZ=a':b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N vHI/ףq(Z%Н@Rkv`d'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v茝@&> qqaֆ !`9>zErDtE ]i/Qc+M (eYUnb`D@[< yC{W_ߵ"va B+w.//Rzm?c;);i/Jˠ=w~r0QީdXeZΤ躘jߑ2(cz=={H:9V~hQJ*#J-hGZ\-GZ vB9p~DtE ]`BW@koJeΐhח8L5C+Bi<ҕ;!cフr"1]!]ES#+j< 5QWv+4B2]n=ݥp2?LW|Qh~Mǡ;BAW]/A> ]zQ(PLWgHWE-ˆ0\#GCW#Ԗ Jj? pQW:ک7tJtut+sBxĕԗJc6tzW9x? \W[ѯ dEY-i[txer|oN_=F*jͤ:CН(6uPsK5 PwT/仒vsw_bM볛۷>; LMy_O~|i3v|3/4MN=3zzD?Ѻ宻^Փ c/m7mvoe]P۟㮘t7ojJ&]Pޘu Z#X)u7VJ)[ -x-њw '~>$yޛKŊ^ۤ*5)ګJL%!T)Zʂ"gU 6`jphPPl_Q7KHz>{+cq;VۗZX#n-_Gof3,vw/l?_{f~>y{Ir>UdRPӊNrZ~(owu؛}G}[~\kkʱFwԽ[tdU)zx崑*X}ҹ"\.e5xE'r)զl(E5'+'\9 Ax hD-}v1d$j\ RUW e &BoKjLėPR!pd}E>)}*c4!xIVEP)Qz[O9'аP>y^FGaͱps/n~(0dÏ^D&o1"vJ S뒒9^zccVb=6r7kdT҅KȔKy'7W{r}ge_~ r`>닎u>*؝?É 4=+Ƣ.at8rXzuM4R-XW Jvff=}eRnޮVKTR7(?(Y:YA%.K$ӇW?[_@wnRj=C[]O jlu~.KI=ڇ޵2J$:Oπfo;:LW)O۲Kx"[3ٷ'HVo)u&U'zD.yN۲{ޡm9tȚݛzd^鑉'9uRWByʧ|ѳϮZyiTHPrt&_L5"ϡ)e`MKUxglO J:`J;tt12,c*gw^ ;q*dpS`4xf:X{+-12Wk%./hf_5"V}㏓%!CxoКo,j[q[pRn;适NדW[4ʠO"畤O%}gnȜk!NwU'[ҕeK5!}"DD/rXN}UJBzF`|pMe4m3ي`%–,!{0U8֜U![HkI9GiL })jgNO 7I'&fŇ)I'N^xXcm}ԏDM]/oaj:`n]&)2@UEϯW2jtޣA m9/?uvN$/J),/4a;\?]mx4l@kI5Wvbk_|wÅo=+n߽?q}Wnj\>n=~Oj}O~74o{c|X\}|ob՟,X={+x}GnRl^K~R3 ~4fC4c1ڗ>tfBids4}xTbk=iSvOa,V[Zѱ3.UQ:0PQ<a0Adc,!jT.SBIg &ܔ PMWr*QEniaC#Q,%q"8 8 54a3u7(NsOOǬ!P{}h+5F1.|+G,ӿ*LQT޽Gkϩ [5`%Z\JIKb@$^xw*#kDq 'Zc)U1H*TMR8{.-bq(Ba,= >>ng|Y=M?Ml4~wt#vPO2 #C)*i - +WaEc|U : 1`b`˝CȰɥЏʥLb`"JDҥٍͥn\TPv jCڽMW Cx*XL?EDQE="nUΣGFN @prR,) ڔTI+Fb|G9hi(Hܺ$,eqy QFI{(nϜ/]82vӖhyJbZr(.¸({\qqƳ"Mb2YqDpCJpRB #XŴTgGQ^g/ד_mZ\8W^W᫷u*[ "dй+[A"VƋ̑T6©!zbG"B$0E Qth MG'S$oˌUƅe=kŵ@$SB.>eF2 ,\Q"=&b {ٓ9qWדO'z~2>ÎP~Xmm^w) 5J'22+F{4frBkQo!\=˰GcزųĥTL$H. 
ECq&А82GY,Y֗ULF5ROxƜcŧL.~[ێ}m~jS,M_Ǯ/ bt0k|ĸꟷ0Eb!$j6+*`UIPyqa}zpEԭ݁ݹG=rn8zPs]uh>Mul mKS],/q#B"mc  4q0;cBC |2.\@iHvt]> P6}X/~8:"8~]z0qyCfp66>w]~n~?㻺#DMu=yYUOfv lÛp7*a-j=UTWƧk)]rc]dɧx[>NҤ$6ARZ 畓@+siETijg8n;%mtVd@3|d TyV;~_-c\k\ቖPBH |RN.pjicvuݺP+Y=`u>' ]ԓs]d=7U'W__咂:VݴS?j9*j\GGƳU#lJ<9*˥TF)JuْU$UB: Xr9, -U)Hä!57:Eb [Jf$y|ti^؛͡C a]-]ŰR7n'V qLjJ+W3[IJŸ}b/g>3srgGeIۀ@*$2Qʂ\ZpZ8ɬʓ6/Y0l.m)Kp~Ыg#~ճهQ) RGM 7%TCsœDE41j+*.[xۗ`0/;CFUk^Zo$g?J 6ibGA˯wa TahcI(ݍ2%,1яv%8dޡp#$cԒ"Ԓڭt` w:)xʯvm|WkGl̂! < CR',*@VL4Qk qVHBIܒ˸XIEF(tn|nPIr{/f;uݵ}Cvv $td7tā?ABR x]/y# $c.=A|pY靖 F1m$pEr-8"=?j^=}޻$3_g_nڟsg͋Wz0 ޼ T"1*1Tφ_o&Y;fz;|ߣ!w(/_W6̜q@ L V=~|np4kG 7dT9!gzb7wdg##MpΚKC_%g3q2/s _E{Ƿ??~;qem[[+be?~ȿ@{1<(D2@_^Y4`d:X̔-]<R>fLŮ>Do:R;~<%gJ'baAR=q9,7t5}M=LԦ6xM׬$ %2$} Wt*=aSa-$8`)FI AnpBMuA>S$!?=1G>|smz8{{ ʘC%JeZǘa@ "JHÁ1 "5Ч4n?:C̋Zʜ2_4hu ) )7Q/L8$D1)hާPC`]-$M0˸k>.8なr*nqR(Y`T"?T~iziCyi]ʕڣU&N(kф=ܶ 'E7tPO&CTfii#GI*e1(M0N#FƎmN1!L <9F9:>~s18t62ʔ[@0o醌B;Efi.?~q<`M3‰ Ё< ` ?Te4̃31)(^kr^I)gv:Wnl:C:#S-ε{-^%ol$FEzt,.O܆&WYo54;*-cbX|BwM~C4 4#z ]\ܷ_H.qW|cL3mY$v%q<52Pkw=ϩ5ohNrqcX̼i#̕8hrd+ɑA=ft{ʷ|XB6τ;gM3]uMúsUUW̌z>*>G=sx5m9>k{]vݫ0l94ck!ybK?{Ƒ$0#wWw9'w  K`EzW5hACnM;"C7=T?ꢝv3 }g_OvKuRuߞGgo:eۗ|W߽۟_iW/嫟83p|S8X@^2k ۻ[r}^ހ eo>|}|stQH1~@>Cx$뤙5aro+]W;!,E5bA8efy{MdbA?q?Dۭ϶nHaEBc"8nTkH~.X;W[nJTFzΆk"lDu5m`** @DϿWT(- \62$V cN2Z~V9ַى|r@y[-v Lai;/ֽ1k=EG][Xe:پ FVCM邗ԂX[nMd5u4hIi 5kCOzLZ`}$ˑ7e㦷|*dv==@9n~Or>.uE?/_N~?9}GFX)ޭq6;^-->[Oz ɞ>k0_[[V镞yUB8mw)s?`GXsT>S̭e_ 1IT|NƘ[/lB,kCTxw"-?|7q=X_\zk1~||]Եrѷvf#=EZ^i-7|,r|yzSoOO xaps5֔T`el_C]7Míx=|::cV,6-y}B[:Wm]A TW ,B}<Z ]q瘁4[Vu2hLXPZhK4FZaYvF gIUYg>Ɣc|W~yzd3tsn|vl1{29h+)#.+YcUm0 JpFkҲ)ķ'*u[bK.^5`ז:W{'U9<Η{'p^v&k/Kz1ПĺM7,?Ӯ-=,eHnJNy)iFJ}'~y aW揎&_=4ߕ?_I#~.;;S{qu7;C׆](`^uѓyeL2ښouHo,_o,=;7ٞ=_Û_w-{jލ}Wbӛm{,?O<>wuq~t6;?æ;.wgQU\_ r>X{3TT/f\gOY%#Ysq۳?.@{7TtŴAu%`&+$zUW L}E+RJ)ꊈ@]1pp,]1nPa hW=QH5*u]mZ@w] Fo] EzZ:n])!2] YW=(kqTt%>Į+Du5A]tѕ>jc{'] >JI WnN'ZTbč~2O׼;SЙCef엿|>;?o}xöʹþ<:kxZYZ\Oߦg~y,ۊP46_VEUknTG[rK=ŕ9?nQO˲Rg kTSPpt--N]7ߥƒfnudY]Jcvʥ^un"2A'Գ H,0.2<*D{&س4:„tާ+%HEWLŮ+u5A]yЎBBb`ѕb2LŮ+4*j 
`NUWә'hR}t%u5E]!x)]16] O1(bʠruE@R':וPڐudt=(>9ÀiߍAAjS0Jp0jۢktm2\JhƮ+ >jC.!] 0B2b\t*}MdFi!j2V{H1(&ZbוP]MRWT](ishPA0NKҺ'P7?nk-8HH lt:tBl`'ig=Jg 1.a*bZ J(W_ɺ%r*!]yNFWC*GDҹ *8%+{2 i 2-j S:oNHWi nHfAh) Ltb$3K)b`i K&hâY%Aa=Fka>sh(jۢީ뽏 .] -uŔluNIWe OEWL bוPƖ.Qte6ҕ97>] }au%d]MQWhe̘f=Tp(gC0nd*dІ$Õ9DB9CT_:͐*"FeP55ėoP4S0ŝnP ZqP5 dn7X)u2V.MgYhM= BisG{\dlB`7Tt%dbו AaI#͔t%03RѕBҪ *QkTJp Zu3 J)9njMEWBk\JS!BR7$+ڙjuTt ުEZ/M; x@=/6vO+lq6|۬m^!P J] 3Mﮫ&+ oOHWL&]1Ӑu%Ng]MPW^ebLC*Xa#]L id4-.FТ]FWYԴS`MHW CqӮ R]=)FzYWJGW1] ձJ(#Ku8.!]10i׺Tt%>z] ePYWԕtT]p>KWkt*Z}cP(}n NRWdʊgq30ZZZ5ul͏[_Y;:-,Dz\sJ̷wƮhɢ3ʂ.=ɰz!'I%u]K633W &hr?Z`\Jfi񡧠޵ -k1k2 \JfF[TʕC&+o2DW8?̔`z JpM2ZJ(=e]MPWYW)MaǸҕ:v_ D2v] YWOFWaÁ ֹM;؃߳jmӤae<m^M; 1] .TtŴ!v] %@ut~jD+.j2h`B`J1ȸ:] J(]&+2fectxS-:}քOn~ؔ8F0SѴj]Bit50Xҕ[%U mC B6jm`XHFW듉]WLiUumҕt7$] -E]1%g7E]!q\RcIw%$]WBu5A]+&Έ:JEWB(v] %lX@r{rag%тJ(cK.ld]m[|2t%A%+E\bJP>j29Rt%$+6u%+nej=8$9Z)Gl~ܖg& CR ;0N2`ɗc6q9uG #3ŌAJcNLA%["l: ZAtCHkAp ҵ bZr殅GZpZ0 J VJhM7BrOu1!qf@qþg EG)Pu嵍`ոA0]y%+i_0 a)VRETAe?Y\Ke|GrlyeWdgefg_B~;/\=nrs> 5fs] E^ݭVn¿I_=ǧuyYVϮl)V~o9W:gЖz^h<+ˏk A޿c}s^~}rww/>/j _)KqG3+qU.y0.?R_G四UˈE'{Ӳ:5-yw/5xqtՑ'=-.?~U~ZƹΙh}>+U0oJp֥1kGӔN7Pi_FU׵wٻ8$WEX`gƗ iHlH 8iJbAb~EU, c|+(|B;ćA{9~}%YiyY#ps45lz*;&a>'OlkLi8s)Ҝ36뭏n[wv:`'Pڗ~ ֎ҡ&oРӅ;1MG@5r"a-T<0˜{-D&kc16T1:Vް= a'٩Z+ bjc]~P,}պ7D&r@&*e3sFN"½)=x$3qZ 9} sjs Eč)VK1!'@DD3 zD@\1T;E=6.m{9>!lGt[,:T`!B )6*J3cooMS#%CYy8>'f݀F|6_˟Yi㟇lRLma \,k!ʹoL|cXFy jͨ13Xm3]NfjZ >5}o)5e[rS %[I9oQ Zd+XL966f&XjDQ[GKFSuΏЇ9 GM`OȭZk%6=D sL%[2]칑A5E&~t:dhZheZ@u&v &kJD.MlV<`5r-xbn򀢣m>1h-ipa m88l\\5 ({<Ī5tJ|X`,S^ [ӸZ=x-YWeQ\ZF6 ώ XΎ# 0d?AGuc6>k5C9ԡmCԒ}z2:8:F @pŶv,S\<56}w EĆI좦6al<|) d>JAq ~*4]U &u!fB Xr 2uC Ave =*uFmTzCߕF3x@MAC[@h iMB‚(#2} <}QEOM[ѷj# X,x$`&Iy@C\90ֲ8tR 3ω{]9 l2<)@@ 7Цb:7%t4GQF;] B!3˲dD)Z6򅶎}\ѐFuw*Jq&.^UuA4"VFvYQ2TOv0$$+4P;(=-k-ΆoefFrpC׊ؤƈ))l aR_^E{7qirlrFqr|1PTۀ<`wvj?.7ėz;U0] Y̏vw fLm3Sk!Q Ƈs3TdI.st>Sq ֥̐MbXtLG˓]@MW".B`) >@ M&jZgT^ېehT-J5Zz@ 1}HDŹuL< oĝ 1XtuqU% 95Q[##Vw qCKy1Ab;~ƟݟN75V*dj4K"+tmHe z,%R?<rb>MzC*[Kbib|V7D_I hʠvVBܶKAGյbhǎ@HDGwH|B"Us&d -\5 č\cw0XTY$ 
fza2ȅZ. u"NtDFXt6*R4M1h+!*iPBЌ]-wC:{ B`I:!%P:VDxhm@;Or뭛ޫ]Dt XH̨ @~8t_Ӯi*}t/ } 0؈up!|;@e 6 I &)n~zJ F/Ƶ#w`\˿.U2UǨf^kIly!Y"4ffcWrrV0ë3Vy;% Ip0!9%׎j~A:zx(=VMJTVGJv*RmJ{P <yG׷nՓf  ͅ +E$A%>' 7Fɣz7ņ(XV1^8qrZiz `>Jxƀ^ :H5-0gǺ.T:cUМ.a] 5dn5|XHkժU>+|Kd3 [:g|EB!]Z"u~kPiiSrȍ7߽Apcv09ڣb51``a2—k̀z1 =per_\?Д n F=>dNU{OYsK(t )XI[ 13pi|l)j̖^js:ri XN6Tj"hAM ?C\9L]hE[A4&6~H_CB#R0=`j?6nߟϋ{qf+;څ Yn2 5.hN_ý/܄]zr1ֆkq]n嗿|sE-vߟH v^Bn7irv&;1O?n! rt`A?n'~\|-/do=S^GJivu>.OGyOtuX3v{Bbc=0-;ebn73ig'N !M9JvPEOY}N ON uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R'  @+TuTF}~\f]=?gڒqIn]lƸ޸$7۬^q_>ڽ!)nnN[+ 䎝'DJ7o-c+A3_ ܞ8gyQhyCiq?XK"S+6CW6O+A̱ӕtƘÖԕؼBW@K;] Jk^ ]9$+نf p:r;]erI%Uޑ6!a|#<]tʋد'?grfWHbygw'?ˋ͏vwB:a+9häw$A!CӀkoY4-(sT~4{!lCW>E`6+]@ϼ!`vۡ+6CWtJoYU]E9v;t%ps ]mNGOW1ˁTJW/R֚ ;7V hoPF~?t?s+^azhygoHGv?՗.=]!ln ] NW *]}d3JZteBW8JW/N[+6CW7nF]m8~*]D~3xdceB^ykA@د)c p]ʥAܱ_ZQ3K H}J\ \&3 JNJW/ ѕ~3ykmH0[`@pU1b/0iL EUϐCRH52Ikdzfy!v.*QK䩫D%CzJ-4>#uK~>*+ѹD>ytT*D=Gu(њR".F]sQWZ*N]]%*yՕk@>u3z>D.?UVSWWJ%uͨ+ԋ6JY.t0űUGVWV#k/; u%DzLaèU'4+˜>u>`F]%r5;uEGʩTz28#ulU"|D֧Jt,ՕR>ZHv 2ٖRP'P7CPfC~B\Fb,/V'/sH2A!}1ϑIfJx,&J F%BٔB!Q~VkCODlh|6P|1&ұd4>gL{?Fp(F)}ΗEC}]m~cY4J6T>l{B)O|0 !d(i[J6l7:HOk[FF\?KrcqyBѧfnq\o\^^ޜhڤݼdc6?>W52 _&:R \m2heC _uK^aU wtywvh{E+.×>򵴼QZX3N pֿw2Rb1r7ͷQ:~?Qۼ@>77i5^ f2;#7ήYe=ClJa^iGjC:lZ{1aAb1Vٲ?91/)/ Lćȯ—>0/T?qz9ˋ`녵n4 Ͻ0lt3(0KVki\ ³·CXbn94}OdkY迡Ycn/9 ]M _޺y9MZQPշ sAh0 ?f*1^YIL:;O;1Nj.=|a6~wg`sT~Mqqɴ~]>4T xfڼ-P˻ZuŤ-z=AaFȝ-1 dlo4i/Q `~Yf7TjO,uPQo6O&jhd0fzmaҏ7FA#]/j@rAGo¬u k/0)zKϘS~Tx ؏{borք끔d)uFέE& yeFgʦ<^s޻n[o&#J QtDb@@:킁Ǣ0t'XI?J0-NZepZq@YL;4 Hm#s"D]KHaɠ)6HԔ1тFQ4`pHG&\Lqx2rla[X ma[PkBE&0O Vxҿ^ܶnj6r>3Yg]w0^3WS؜,YI22|Ɂ,J%11r&Ht:D@I41E 62(C/*GT KIw>,L\9KN d4n_];[*V$ +}`MX6&qTq׆c|3KL54qː_|w4fn{!J¦ ;Ʋ{@lvSW˟GPSg!]gb[…-¤gVpc^de 1nL&2ΜQIc`tN!ܥꔝBN!LO)t-ʺLSуTT9ŢkBh MHEENF,cV.Q`NL)\ 0Õv̂i,<NїKȳOUjӓ1oKrjmK]K |]\|jdVTT._u2i٨a)dz`0"BL$Jaadfrx4,'w"{dbh|ס sU4=,#)蘄uKP4NXDIRܽx=Ts@yr'$º{hH&%Ds1!@\pm*EhgH V Si4}iARϜ066rCF{-ritP4hZJ;u#H RU]sEi"a#PBBJ1ho8j6S2+Z6绢vT—=u:6.G.1əeN;0k)v* e'fw?m\~ 95\AS(RE+-;b[EV*#9}X)<Ȉ`ށG)R"-6it*:[\[x 
N[lBtH\x9pNq8 ޗco)J)ܴ&W#}kQ_jY/@v .ko+tq**$>u|&V0*0ѣݭ._+?^gE:0&&%1WKbd^s5 ( `rhaChwK9%wtU nfY-Q`(U|4,zj՟9Q=겓UVeӴ>?&Q4FFqkTlV^U`J?r=`ǿ2_Ϸ/??|&/>%8b+HңHQH{7(iho4Uv줊dyӹqw3[?ϿS7⛋8ŕitO$o#@9UyF-뜋;Q\DTv+|By6 F>OX\E&xFHS@j+W%.r#Մ`P mllX*,?Nѽp]I cM`FX&&0UME &"[fO=iǏd]3~:V!YvyCwvNzT!$UHQ+r'=,f HRpb(} 4jYk hL:+RE9 ?.?%430Iٲjpz sZd$A6ƒ Elͻ' ;𕈇ѳc&?1)[VE\ 6 #)z3S18PJܦ EZ)"V"D` m-tJ Ca.l 9ݍv#n ʡu5Ŀ?=RE>G>h``A`\I`ZO`SPFgF0@ԟZj>˝W6z q>f Y$u7VSzr }O$ (IAo19UJ"bH+&Ħ|RS Ř2m1R2(:o;c޵q,Bs/7~y>$WkdHʑ俟!) %")H#iLW]Uu@,@mrpIX90P~}+om1;|=jwy/yݻP2*|xד5X7_Bm>_bRMmvw{oQ ߧn(` [ϟ[Wֻ9͞jqCmb [V+m=]V׍wrVr>[_ݽv̇*wY+ rPg %j&9onezX.pb ^pUTD`Re]#.{"ݴe7kÕf!HsT ƺȄQई<0#9pػ]-&O^?CYqqzy{wlǤ{3P0Z92G1fRH ("$G*B%tlDZR3 E4e'bF()E/8'j<Մ4Kfܩ YɃ>nMara ;Ŗ)(x9zks^\9$(FQ D OJR9I'qz" 8,JX[J؄ UsbJ1,,B(£bn"[iyܯ *;M A\+QH!UIa:%WEޣUu: 5dgϢc˝CȰɥЎ2hR1tibs)q#Cl jӎ6[3+қ|zGRu#hH[且Awi aRZP@ & E|$,2%(~`]H"NuTbک'dg30C!x("ˆ(zDqѷ@rgzd?.Y.tPN:i@UY(M5-[<"RQ o˃2θES$Uhؔih-_~ԼSUˣaY[zy|WItWR/*aP CMG%ʃb ]NY(hX/R8[F, ](ry:}Ê\гPڲ8gQJL6эB;ޤz&QHSZʤAt8_̾jt9KVtJi*WNϹ!j:RGZ';%mtVd@3|d TyV;v_-c\k\ቖPBH |RpM9xhcq>]P4/re! 2687g ݱMfi6^}&{3 Xɻc[#\娄٪qiC -V*#3 ^''<2|'c;FxT7t̙N `S#HUJdmX#|\RBqm$ aR sLxʈa"mr1^\%n$:\3kw L. bww~Z}{41!E*M Z^9l%)a F^ORuFOPuǓ< `L$6zojit(\JF) :pbk)fzvjv*b,*[mb(}yii+\Xύ 81 L$xZĜ$k"zQj}^1{2sbnie)N''ܩ'M<t YKQyy*R€a$M#Q!Mrw=`PR $ W7ʔz?yKQz!㬧]gOUүs:lCeDxZݠk9x>c&2J  hl`&W58+$23-B=[G<\]sZjl.dȕѩ\UIe" Cׄ S*Q3*($z}`p{f:2.$bRѤA":JK20D*T\;)~ׁ~oŖ9{Qp>en0{Y^*y/Tkҍ7@2LL>yCibFW0@,#‰aAeKfA2}xvAh{_yk]`s 0|">V\ZYJ$WYcb8:|1ˊN&O8\O3۸Ia-Z36f6໋hn1V7ߐ!: ~n.݊$QCLb8i}#nOl"5鷚F2ފfwLs_Lp0N֟ot2[0̛?se4n}_?/7Ï?_O|Q ׽~!^. fl>@jG"(kgLGS?c^{d''G~ݓQm".dQ("OGOw{-ߒQJ•\}j0X_;\^WYz\YwgۋqٚosmT2>n: ,mwdvdb8ts1s&) Vw5uF-tͥ͘]CʕO;SjQ}ՙfl(ghY9U9T3ny1s4ώ,ʅmR}wHri=}n9ޓ`:~mB-}"n8^WVrWap]׆RM"#mǍ! 
.;e5AD=4.>u@^%mDTX A'' 7DEٮQiBA ں })y$_[ +㟋wBٺ?\KJ6R+CJe\Q W(.vB)A^!\1Ea+b]+B d^#\qŵ3;WY\!WYZW\i,=\"2 tj/㹐*W5g|z2 nԤ+b 8oob4ା#*J+T7 7`e8V9׺ܬf >-wCZXZE OL3Ub9̩T#:khqX%9 C{:(T)e0?n8~X !Aj1ңSՃcYtRWU#jpr΀F ᔍ~86^ sDV'2J@fVDG hPSA> oW#˝e8kdelHU>maS @=bQR.m܀R΄P\4=>Rra+ -MR$fНHhWUִ>RI^\e UXwP\C:Ϝp=-5•҂!B+tPZ)i Tw_#\嬰K`XUtS'm,W WFhr=fcB>dQJ0Njb1N#KXTA8A`bL] irL َ>yCԒRÀ$wWW}uޫzje,^ݡ:9?ɾ{79 E S 1_<̮Ǣ|ǔ= W05G_g}?i!L5`:+iV_̅C/ErUTA-!d40&)`Eo}~%,OuѩcrKOaJwƔʾs:>@?!/%ُ%wT$0}7ZDUi&lu\Nvo򗂶!5Ӛr`Ѡ# j FcBI[槣=^"-1gNW =GPR:TKWz0M+LIs*e)t qtP*3+BAtsJpn ]%;]JTKWϑ COiLl=ϺөL1Npl K'ʄR,Y#E(A++j ]ZITXt J`8i]`CW .i̒]BUB)%HW*e 3ܜ%UBˎ~.䤥gHW 3])1+Khc RttPVm[z>tIt7h=)t>V\P2jWO]OwiG8 x$P#?A@Wv=&T=] .= !pB)th RNWR3+Bab 5\.BW NW B-]=Cb$DW 𣇐> .M;](uKWϑX3?"AS(Hq՛z3īďv*V=|4Tzyac{FX**d{J#/A;~m/Tu NuP'Klcn:@aݠX,-$4ei߸I(jr Jl ]%17artP3+1y3'17 {%ծ#]Ikv3'7-F*ĢgHWItKJpc ފѯ]%v`+pCq^[oЛtFfK>_KNE(ۙ? ZȤ8]M;YV|: @Y=P[6Y:.҃7A-9 e-^䩎cڪ(n#)clx9(%9_WJyM˞ ƹEYgFROǿ>e_F72:%Iq^PA!!u*J/+(K%X+cc+`ǧ `kSyyW j"߾,omB 7PH` 54Ҩφ,nuO򼷡P(tcK8X)rYd 'SȨ;&4R.Tr\#8X}؀٢, `pJL=V+xhR-}:[]b,RgQQJ:=c1KASLFJM׫#&RB)7j<2J#"( -J)+Z"D4֭#f:G-.Gꭴ.LP*Oy l_}MzOoi&XeQejF;=}BG,-{QVuj+H*&G7%2nQ#@DzH([gi"YT$"^JuAŔ2POk]QX#sFu I&#:D:9rނffJxEJ.W!塥iر ,iT(v9w !7COW 0z>{*B`SaDd~$Ѿ ~'a6N{KId۟Az:PvAՆS*# Fx6y70Fgz"En>Q~>s?d7w+*r~B|f֟v/ ;EʯO.{IN0It^S)?%3yIdi:,}fˬ7 R\qGLܝJˇ_ߤOoV0f~Cz^ |.aߎM5N oM>GCOmGn$E m(bOt k/h -z^ޖ5\U]ۘv6H<[1 \uK97DU *NC-fɬ_3e>)s}2}̇1fMyIɵwy_.|)=&]>/JͤB|Xv;Lo?_Oq?,|ߛ!Ao:gL;fb!MتDs!`rCW_M`4]V)mn'gH7}onK24~?I9LԤ"}eGA%yEzk*  D+$ mW)Jlij Vn^ύQ[>t9ܭ6q,p* :TLVQR$gOA1)՞2#hsz4‚SoU1UF@&hyze -}6da|ҥ) > i]>T˾y78~lM[U[Ga;V;V۪:S$&nQs L\:2^>ٶ#iZij״.낔̽LkϽ AN>%kP,sXmC5:WxG޵V AL $hV8,Pp#"+M"! RHYvrk뉄OQvz1 aZJ~f6΃IWb/txQs1KVp/hB`96V*A3ARAFO$ ľ[Nc3Z`(3*h2:Z0rŸXJBpw,Y7YO@4nqk_k|wvpbS,wXݟeI vr!IVF߼ΗǙ?ڠqrƍAĹ"*"i o#ʺ%{0=+VXuݘڲp+Ïb@4r5\"̻&X|ZFkhN ?0'#JS0 mHXGT~0!S&@U1 &^$^ 0iH-W ݯzD/Mkz4kz1KqWU\q{ƃe{a-X$.gJFɃuP敦h*hQDs% N028m.B}? 
QC4)J1$9|T P:o^ёFuKӗZ$晗֥\xʨ=!Zk-&jM39k^ Agd2]]_PH޵94`L3mY>4zq>@b Vhz6ϡ5 Qyxv3_x~0L[!6ѻvf{x2~y3kҎ#-2!GBGx[F4 <*cO3g1b1GWٶ.rӨU<@ds7uL +!=It*AlT"Ol4no~ﯟ?xo)o_<8sr;"@#d<LCs ˡn PP8|7}6]anv( Y} ,>#_tGϻ4!|s6"0S`*4*D7y^AMpz[U[|$zRIoGR dhR@T;BIX0F!1f5%6Ecy#}`BTᵎcp%J|1$1 _,"Ѕdpͧ'2|&L18eGĄYwCwv:ȳPޘsOy@bivytF`U2)V#P94o}בtwxHBz9%<@'IhJ0E꥗`Έs+'moKYj~;*/7V(ek2& p͜A!EzNkꍲV\T:-!46pMN'JVR+ tR>b9}NyDԌaͧߔ [\J۞E&O/,ilGnwބ c \bQGgGF=I3(ʛ+be ]Ku%WA|xKx=R`NpZ|z9:< Wio ~"y'Oh[nI.4֯or^z&ۣ6mI qݨ~ Y}3Ǒ{s!vvmiqrP& G(=a5 9X7'ml.ᅼ>]N{$tt5m\?.UNڮnXw\>R:h;?y.XARLu|q¼i }h{. ȍ[4a^O'F9qxhAx.^vM~f}TuNWm$"*NY+u 罸A; tfOV<1Ftp8 QP0)q+/!o떃sLN*H*"R}2 OH PLq6z7SyixU}' ,űwəS\ v,l5pzl T֭"^ ( up`"Izl l ٭l_8@Zl`mE6{YF$R4\/z#J\Y=L@G 2\4/D0  Z-Wݦ"zr9]S^*`ΠEo PoIhQ܂L'e_@!0Ɂ2-x5U.3SIT9it P y' 4D'mdR)P҂HJ d}>L:%b iD &Έ$UM))VnH(kBYshHz2\HT2!xB~X{tgp#5h ,]r4,lr,. q@U:V'y;9ur!dwH? N$|X'\Z.6L(\гi@+% ΌI@e$:j>1~1%AGp(9ͩZKu)a6GUy=n+ee˖k}[ l%HyӇboN%?lvq]̉*8BUV7sk'Qح_U11k@ 茝C$b4(-#sPʌ>'D1sT&nItQWhc"1MP )K*RS69k8rb2l9jU V1XI":M_xFc?}3x}؏ 5<9սnro鰞Ūsݧ ü742ͬɃWoMmo,)*9XiBD4!^M>ycm_Bt:̧-6c̺ yB٠A8\Ǝ~燘OuagуEGFu{XOH(m=%JpChAEÇ#{(s%øsgh.)u(Ef2"C)yD9lz۠|nGrm&ow;MD~R/^W5ﯾg*oxCR=Y8{{zgT6^ۻνȽ~Al]asm{qeEFnl6b [VsnݿmIwww>?FGZn\ts۷yGܝl?yQͻ2'8=x-K\<龻]dk%crg颲lGQV;45dŊh#EL8hCkfhO⋿+$ %.Te-*Ce$'1h8K<ʅJIKb@$^*#ZX\tH2T=Bd )L AR-lB%UZ3#glbg ..<.\͙dxs46娞5vP!e8:;FRHAU@#&*trfѿ*I@:\8 ٳrPCdɥT &e8&10mc"^ҹɥ٭d]Ÿ+Zz[O{yM>k#)xeKE-rmr\ʢ|V]0 -(}> XK ǘ+Bxt>,Fn}8tbV/jDQX#^#q@r%=2jtRK boӦNZ!cU;ASME1(GD*@8 @#<(`)_Rţ=i[E3K#g _CX/2,%EUX/^/zqdz"Mb2YqDpCJpRB het,}OqGW(e'PaӻmU$>_1T!y_J JJ_e~-t`%T4PTg1(9Q"$NеDȃfDrс{9Jm)d$9-ZsJsøè9PCq),څz6eF2 , hk1-b9uWozp!Ei;X^z7R@j$NxiedV0hD:"B˰Gc`۲TA҇*&h`$cu2CP $Τ eQ48KA[Vkׇ s{6emYmÆyb?E>Gaú곴Ei[z|W@U 41N1˒ D5Brxfe.{EV'2J@ƅ1V1e7ق,}6SKq-PՌTG@WxSWu&n*W61oZf^9 (tE74y?fl0X4NrWM}7O>7lBd.}>3GgnxxcѲj>^XlKg/jet@֥7ɤJ++ҿWQNNÛi<\ _p~_qsòQTj~EC$)ˣQu6r,+,ʧngJ{I_!jy 0l`]agMFexg0|2yAR]Qs*O{KQo'τ˓e5߿^s[-A#Nz:]KI$O*yS ci٠޸P3bwj-RTMZ:dRL6WYdH|B]v@`,&eIW;/{"e#L\G%3G!U#ihY$TeWK.w$ˁ{f7Koe`̼:>^'RB6_I]7yw58:(A{nr Nh-cv \0'tʣa 
}Ƞ#z4:w\o3퓆7&iL9yF"ag!lu!,:zTa>Bf"RwZ \qI+:\)p yXOW(᪗X*an[,Tim+le+0A|pԥL~[wn0FzD37(Ґ`J22>}wNyg$l` \`HmaHi{~0 ڝ\Jس+W󳁫"<$)j칇7WH dyelIZ!eኤpeΜ\⌌".ϮHMHEJczzpe✌AXK8"q-;cuJt\ѷW)?.ڼE LZVw%]w':X\sv~ZfW?%)K)VE5+mas2^4\`^0.-7FX;6PEz7 o?~st;WǴzGqn.0%0;3g7ƀ 7 ܳh罎T`&HS_D- }߾o_/ۗ}߾o_/ݝ(Βm7>|͋ ǛsDSjʆlziBeB4n\gKlב|[vïl(f')fAf>)Ov`R+E1edd&S9O&kQBĪ,5"1y_[y!C#yF!4S\:hs z%8#3& sUWT } dd.ΐ5"8rg Y  h &n:Tƙ6i)iZjOAo.c~X-4rlӋ`O}ND} (ˊ?ʀ~)'k9G~#:Ft-GW$K} UIɧUYܥ$=rL2G3z'S ]VgVޓIYx3PgSfiY0V'`4?T8d^"W*QϠeSfTWp ?䷌uFx= ۻ~jNBi4\L'a@T4t8]U~81`lSc|#dhUot7krHDZA6G"38cY`;?Xtj:%2 !~n[k[>}zك[Tͯ/<bʞĺF5X~O]J7/RRX/FK"m u_-zqJ>fx4̆~At+Gh">-8H2vҊl f4Y%' .&<d)@{ [HQ`Sj4G9ɬ\"C,[{B-qv#6Gq jWӎ63ح59l+;}@cL@#^=YnIZWR(e+YiCXŁ,d"LȊeX1E6h"$,Q85$ 0vǡ*#GĭY#$T6#""hQ{b5;.rcUDt46@qRG}рg0btdC KZpCVٍ_j$\.-묦%⢩nHm٫zPgH1N#-ZKcĈJ!!2,CŮa5 XVq7YUYےd|GqA{P#XYpSJlr;!],h@8 U yp So_e/׳Q/r\({a󺷼2gvr*_їc.1y!omh!rhq#W}nQ셋ܖFo\G%3G!U#@*R2IgOr]'r`vIVƳo[=XC,R&*w鴛+:+]vu3籫$'jǎzn͡hFzXZZڈ6Qc qc8J9Şru,?<o]>§Q#ġΥ*@ZE?Kn1.8 (pZxɗfP62M$ܳ]M?6 c$JJ4,3=Gz%$Zdd> i1wG&NŦM;FEc8ןLa=r6:4moSr{t!n(*9;1-)qk)fLTԇ6i-_t\00HT L$c)Btr 3HrJ`g@PҹZ2+YS:8^s4 ɻYu8>N]][Pj*m^502; <q6B!N "$H4ል.F~|_W:cjCt*H6U\==Z߆1z0 gul}hC)r+*r*?}gú%lڇ li%Q{pF4dgտd'T5X|>@/d;x(.=@/k=5gl>_K`BͰ*$7;I>$?  lM9H}9bHf$0ݖxB@T8ZQ |kتvYʲQ=< M:sm(\)_\q#)ǘ4ͪkoy_.|)/{L5yW}G}&fzx|P1ntaK~fZT&տ%|4xD:oLA\<0 { EJRڡhzCl8 >B%*a4uJݼoH 1 r2jRч5 &[v5+DDtW j@Rtڻ~Y7ȃTLi)0jDlsO541U,9[y$QKo-<5h0; LT%9&K&1S#l0\q-81e- AyazixOl~-Y[[wBMZB}p-ʅ[U;2"D\'u"'@F@ O&f6]|seKu\Tyn3VXI9c vdq,SYAVXd|] CAI'L;;"[6Y?)D HO$`yd=3G{+D5Ky>6_Nh],جc'=tDj#&a7Gy'W܂@$[x  sH[G‰uXT'%GJJwj]'lkK Up;o*$?߂<tHbaJx ZMfNT^e3Z8N~}6dq|$K{|(I,;5 r}ECHԗsqi42jO%j/m ,5I1!I#hkKLs۱-,j%b?+ "^5l;|'=GBtc 1h7 [SpfWkfaR!rm|9d'=)BZ|2^AocNлװ잀׮g8lԵў]l;I$-PMUiN;.|+'W}ɗ-[9,Q!N+KY8M#y:qKԥDҔH^ Ik N+N(8Kig&`Qm`$zNlٮ zZ=}" RLk^o;BǮsϠ}0~f=`ͅ,iI22*XJbcL+to7:ӘGPƌAc,6ʌ ;Q1,%! 
Z\tȴs3_)g=>dzM|e4uA(kN|$RdeWAG;߼LuԺ&e0(w\qcr9qHz,ܧj;ҷQ- ѲDK֜ pQԜr$8`-Qh :]TTqdRi U (W J&y$ ȵJ;fAVSRUK,h`8zqM:&>P}zus҆i9WærU-EOFkE tI5be8x4,&EtNrNV uxׁ.l۵,W/yHF$컅 (Qq^,#F)ݍݖ 8B$rfu^B2-% 1#K6₤k+TW38C*O,鷑붦ښa$Ɣ !Jz=-ritP4h+U.J#Hw.>H18C1OqQHi< @yhƖ\}p]vG4ΤyIOuzf\3̲:6fmiI-d=xotј1¢P7O R3O30BVYr.NHN/P"ԡ4g?V $ "DwzRP4tx CWA N[lBtHp5T( qX2fEH,c3#O} ߲3SHQٹ i90B3NmznB`]"Jk$g 00yQU:+'0QZyS~>; fDk |pp|R,ˑ]1dp;:ŷa3n Vjl)E#ӦfHs3\46Tއt4. F0bǣz6uH֝lj1MNq?|~O\z> 8f 4pgK ƯTXV6378;7޼N?out7Î7YpɹM¯"@{pvӰjګ]lrR /.[l@ł0y5[8H=4A@||OrX昬i@4ns.epDpDqUX/̝B9L4KK!6}~t^7{^tZ9٪y;EޑmF[w w=rGŁr#υN'+ĪmqvlCn;c`-ƄQǐ3emn bQ}xvÁq-qpau:\lN8fĂC Si鄽xhڝn 2lj%"rTje<QEpy /lKg4*ƨr KAThªq(jN2Fms)Nf1}raRZu!a/.nI-5x8(?lY&~8xMƳm?T7"C1(˯oNв`X^*{65#o_fPze`b=Qh3#н:&u9}ewƘclr6E|XJu6`%c̹wPTrFm`0SѴA WfՐV*~(f0Wk,Lv,+uR~kfE7٦]p1;k_E>ג)E 07w|I^G$XLP&Xd)d06Vq.@pD;dZH1-$rBVm7-δ M ݂D0G;WzW*Q+ŶUR!\I$4%J+3pU3p%Dhntp Ja!J݁D.ߙsDzJT#\iL@!Jq:r"WZsy:pE֜z҃1pu;5{d?2\ݎZHQv\:+-U"]D-CWJ;zpE|+ 3p}lz vp F!);W\*wm;\%*KWI(CXb)F0nu~.t"XL'rm#ɮmv}HEyHv^%Gg \R[^,IuuSZ MmiQ#4}4LJ.Ubb`zQ&Y;E[ rn10\K+F~͎Q)O jQ8X+Rst(]]E\ +NZ/YbAFI^*N!.喳=6-bt AwCWCo_B~:5ϭ8CLa(7Еzk 1҂o] ]1\BW6c+M@+>;6ag\tpZ ]qJ;]1;:=Zb;]1c3(t]ߣ}@h3^xL0ћۅ;'qZjXRdby<+cC;&c|\aq~)OE:h*(Sq1h9hd1>;3p.y;d,hi,-0\Z}3yC,-҂C!JK+&-stx7n(-tutŏ>=b>o0J#bN*ФDWDWbڣ(g>E{brAцpt(:AJکE5QZ)O>,=t. |jw͏_\pD#i)o󦾾p{mPζb_.]i/?WqdmvP'UwIg.?_U͛\.FԷ/:*}]0^tͶiK!EΫ=~V~ħG|r&m;i`>dqpov3fOܫ]kV⇑ھ~7/%/d++/GGy"n8 E ^23 q ˑPO.w]AyyU߮j yg%l(Xѵe)ڗDުu^WY%*wson!,F'h u0n _?Ⱥ*-QঠJΚT|q:+1·QI7p=c+! حj.ԹZrSmr)fͰiA5S? cTgkg,˝Ё5 }P cnN-K;&Ut(Qom΄1h)EL$?'{j%{(*#i Rڎ(9ٜ\?]= zͻKHXùڊ> Pjf%z亦ܕ)aN",=XzB8 9\ }cK5cEG*Uc|hϾk;OoM- m*F)]hmlMʒ&dK*0L|v 4fèiLU7*)$(@+ЈK*IMk߼ȵ}U 1Wmh^u*L%;KT}P)}ziyZm$Bɨ|W!'c#ZZal%E6 VAfY0H!Q!s(Gh٥=6*˂Bф i'S BAnQxi h**u(:ݡ-!xy9рO+ %eìaD)SA!V/qW$Xtypl5M)[:+BfCwȼ1- YZec`T" nrIiޘȆR{0Tkt d{zCl#Z}A)9x X5lGD&0VN ddR!}EБ$x4'X(F JH]鬇-? TW7"j,8H&`ΫmO J+Y1w|э!ՠPw^KCp2P(SP|"(0,JLhW4 ~DUuASPbt,,x:M;*q b LN QCF:7 d&/sAGM-V│D]`#)#MEUP4kϒ (Ez@?Pi_ ()X|A*AIr"YUDI)bug6 f1J%)`:ABPj]'%CࠄfNY"1A! 
db9WTU/<-dPgF\z 5_nݢbA\ JHN+ zP( / j;?X׋w7~eu~{'c^ [ŬΎ#D m3bka&a=Cw /M TxU2GH}1[Zm\U&1:#'i.Y. Ao%CJ$JET2R,zGpK`)`^(^b u6B[!q2p ?,0PFu5wNu2<&[WB(NCO5 !t__]\ck>wyv]8r VL%V  XF;KrIn ȋY TɣB- ā 0H HnLCQ{X1z, mM!gT]lJvl >u))Ģ-&7ft aہaryh0%/@Gmlc:LˤUUPZ|MC@;3 5\oFʰ*+ᠻFRC!ۖ55KA2Zu_@WHgYtggMU@HhP*YxPڀJ5񭷦:x$TIf-w;$aS6i` nQ}AOjL!zse_+{ mڛ˾\ϕIv|I`0u gllfѳq5eo@(xjqh\ k1i3jFq5R 2j31frvp gid <%2`O~@rXr* P.(7fho8w(uR"TP=`C(uAAJ@25di3 A)>CO7q1옍}u`n?y xMRP'w d!G!ϺCE7 F-)Ca E5FHKCH4pY/xp`\Sclci0Ih j hN\76xk+fnQ4C5kփ*H]6 |tg҃d &S@rZx tmKzgAц?jo9hE:ڡ*f`c(AdJ bdzl?-Д nFГ5>dNҳ֞fJOQ!,yJi]0d@b~pQi4 65fs˥ZpU].!b!c-ٱP|_EH 8iBeN V3f !KkBE9:WXD;j0BI>z= R DŽۭ^ϗ^rumm،YnӚH5랍WyO?y}n&D캕)R_#4ƺnAtߦ~{[| vZ]Cn7i|q;뫫vMz+f}|~ 0 7z|xտ_lN77g/_j^(g %?«w?juﴹvg_Jgwݺ=4/Bn{7_9ڸmf-0晜@c9 '"q= 1$'s] tBNoGt}Yt8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@' 8>Zo>W?t[M ݾUzw JuI%&7.eb\:'Cs>,{b!.Q:Y"]^-bi1 h㷎+Fi ҕ%iI ^N1p=-m:zʨP^;Owˊ.rgΞ)ŸOӏ҆#K`n@u & 0 nhŶڃQ>z{G=s =J;dfjM`"xch6R~4Pg[*4͙AڱtLk ]%1^%mh %k)ҕX` Jn ]%1tjzt(%jHW34 9G99izgD-]=ER\ ٤`K*Ų)tR|tPU1tp FMS ej8GW ~p#/$CWAWC h]`Mxc*e)tj}tBXtH5i]+ j ]ZŹtJj]UKdS*eg]%k{[z:t%ktfqy :JWo]ͿOXг`X5U➕vҽ(Rw@@SS ̧lT£=[s9wg\d"<-1-k!ô`E1*c^dt@v:*xk.(ĿRcZ0,Zה+nq_b0&}Csn60Expɪ3ϳ`<,(m~МQ>;gB`Qkalnf#vp p\<@rFϵ,wJÃNa;jڢvv9<>ߓsH[ cʠe9(?*oˑMG(KQN)93ZσaLLHIࠀ3ZHpYcR ZH(jS O0n<3O_!5>{ !N웪{_&|v,D#GQsʵ'E "%.*t2b4u @/D+2>\uGn_A)҅}C!a &LI8FҎY | \KLR5lZkN4.uf¬&._,7_*k.#_A IrrUg(._z1W1_S{].Ӆʹ,OH.=_V-&k*Wo)>ݺ+3G-òQY~^t)=֊4 Q*)UI5bi.S(ѰH4 @މ`90PÅ g%x7x33,D^G*#R"%1 vK I/"AyҀt~νxJ@p/Bj/ɝNFy)~leE m((|cD$5\[rpY38C*OjQ?nM2M( c1(`7DIoY.C~EúU*R嫇0 yw B QG 5. 
)TXxm8@dlWXhΫqP ՈU2O,S1IZ˱q᯽a=O293iȜvaW|nM&fwdR~eo&X~ YsfuU:"Uoa8S>BǣPFU)S:XA(P F0Hg:RP똲 vx)@0@P& -@#* ~ԇ$CV'm`~ ݏ:F'an׼zr](.4C] n(@W ?f`' aaF&g-?ՏgWfV)_;?V7^̦m0h͗`.A~ݳTzv-9Y"%߲k!twM 9jkM]jjZͺU=z1YQ1^ttsp9~qb{7ZV붶rZfАT1a4!,FwpZY[ ƿP.lPԷ37Ad|~y_ū70Q^o~|/(t@Q~9 TGU_sՊSU -ݬ8zhUu UE~P\~9\BefleC B$$W3x_.g ܼm)b.Gr zeW4Q.ydwPoU'VG(X2T/G^1~'#IOgjp:*dEn$]02.jCk'WsL\RX9&wԺ _xg^6n=t4QaJ`B(["bQ9a:)w$( zT!$UHQ+hL8 ξ .)8Oe1>-aRZu!ci0%^#A@ *C($GPQg1Hł3i#1[Ű+D@CQ:htuN}2i!x$l2=H:n3tܗ3mbsK{O[慜+[Mis~1'Qi88/1F۠LnL )&sUaaćI΄RZ_11 swrה_KTJT Nm \q{3%mX>[22 =w˸lipp5%XNDAz9> l*1mD#Grn#G9#YorY, 3bؠ^Fδ1JB IN AHRVa Sh՘ID@92b=6Bj4BZr'sCe@E1eH bR@l:QVt>R˴Ŝh!Hݽ︧>s[#C(;(h|>{3<y4IyܲJIJI? u&MJ򦫳]koG+ d~Tw/@po>$kA?-P¡8AV"%IQMi( 2S3uZ V^B؊O/Q\1^Mn7|ۯE6=3nRǏmh;b3Pk4B}2!u7[g=xߘ|P9>K;CLZׇ|'o(0t-v#p)E## .OI5WkF6SߠeVU æ;| ih6N;g%rb}twy| h<~ih~[ mv$#Azg<~u}K=WWs(.ɖx zM[oB6 յnhT=1]j>zx;IWct-wGV8Zçu{|7xMf^{=PN|ˣCw|ȚG@>_3"q=q^UIY-/Dz}eݘ]~R+c{"qsthokq>B6! #m6̍O;j^|עvp^pUSh̠ A+ǝJ:lBj)E$HBkNrlU D HG4Ϳh>C%Q,%q"8zJ_QiĈfiޏ;u}:9}psOH+6,6{| :s׼@5qs^y9$(Fqƒ+RBrd:$NBOD1P!ǂE SbTrKJ#c1q#c9[[b!BcIpc7gv-E<0]"?y͟h4<ֳWAܳQH!UIa覜SmڽG*$=-trmΞEǖ;aK%8  &e8&10mc"^ˆ]L&bv6[O{yMp[H tY=nis\^S)IHhA=d4(yT5!/YdK4QD lbƩm?30v@q,"ˆ(zDqg @r5=2jtR,:(BMIEBǪ,wdPTqGr* ܼCRţ=i[EgˆXL\F|q^f.u)Yl0.{\Y/\eLVxb9RF:|K)BL !K)p+xXlut9q+}*qW\Bُ#g?nV6ʲh㉏ގc%%V_K$B⤒*W"@EXP1@CvևL"q&e#*GY@}N'+q1q+9MraսeڱfSa[~=5zۖ¼m܏Zu< Meޕ ]_w Ty'>ͷ-WsFQJeeG`"IJ;Z/ίtٯ[u_ 3uGdxg8.(,MJ p5gٙ"'t5d*PJQ+pº'H_)76H>24 &zh-l &[-% z2/kۡۼHY!ԓL}aO"ZM}GBDL0.S'Я3[0餬Q (\)LrLt 6F >5Nj"0 E)JoxCהiKV$@rQ0DrjOT$: o#j_M< V~$]X-HkBiNScN3:G@4ӹp1M"Pe{[RW¨/c-_ɔG+C7$G#H\ۼ.ԫhRK=iu.$z됎MZ~[׾ m Y[92#ԇ_&( '.ʪ zcuegX\%,x#+`u"  NWnϽhIۥR{Ǖ CÃXI"(76tNyCZkW ]$BL@JN0R٪a!(Cz$S MT&H@詣FksN_"舕.%~s> uHKyU+Zd5˳ߓ+5{Fˠ/y)Gs g6&Φmvsiٲ1fm%5NF|,36$rTKFg![Zt^0;"^ڳ455[6|.7p1x]?W._Ow,h]9SjmG,G[XK-P֪sT/G5.[~Pb -U)Hä!57:Eb :Xۑ̗sW7FO2h E\l) 1b|+iX-v C5^Rr*$TF ̊hX;6Go?L~hv84tE1jr&wJ=!&ji.栃E5&4OZ#U!2{{8m4+qއu (4`4ɀ/ך߇Clj ƛ~~RIۣ?ƿ'-8Ym\ﭳhh5 GoVA 4P{^WioRI%(QܤjAӰMmi;mlVP+'V}ENz3Gen80GY^*YN:^FH晉ǜ=A|phY靖 F1m$pEr-8"=?Ə[9mP՗Y8 
}'y`D2@zeϏzBm`f"|vbAۏ]fpƼgH_A@F׸[9iGsDT_Dqxu=t; 끄+ fnrq-sQ;npcOp2sRMR0RgʹT;ۇB73&nˮ!m͕;RP}#Nݢ UXے?S݌'8…]O!¥]Owv6̹9tW\:=O,D-7h1j^ DŽÔw1c~7ef좏'Z'lx)Y d!JDHбwsUQiF$NtrpLTdDOiP.HGreחyTh@ W}8~" [!srp|hAuC,b= څǪ.[\l0W\Lŵ"pv2ōv_] H]D+Ǚ$%LoшW`?lQ?K=CWVVـa`L$6zoVPrA sqkA.SRSS޸+b!&N/ӡ>ywvonճS_~4H514rTޔpr~!9ɉTi*26)ʸ"hRewaE7fq5M%j?>7_5o޷anpno{JQgTM,.si㗥Uo㗥4oS0,ͨbz%mfAQkqB*P:VF0V$?]koF+v0]`fdf0IϗA=BdI#ʝ{/ݶ&#v6Y*b:<֭"(eݿ略c-yۻW߫bf'̄Gb1:A&q{8! :ICT2Yl|9{$ui=nS st8{#c,ʼnOFMs`ݕ#~PAG )#'yĈzĈ#>CHi xAJBW3vBtǡ+g׃dstu`+ApKWUA(+}]鞮z~+̿lkOVѶԬ3+ƉCt]!\uhi;]!ʚ|sA9]!/ +kLW 2JNW9ҕ Bmam:ƥz??2墚|̈́VL'+9|H bY4ClC (J͐СL(abSmjk oqPRt觟6.q(N%H}:`:I܎7K ׯ({R|ZLf^kq>LoS(2C*T;BQ,q>]~vh,nPŅ^ŇБ=aPe|Ycl-_PX=6('ZUm8(&jPAh6VikJ[J~_2.yty\V9_xo$t'<Lߝt\d+'Z"ÜL 'Odfe/uӳ+m&YUXs &] OYQK{=BlzY$tHTd ib /,y +HB  - \Z@V=(i}qSZ8ЂĜc!w'-/! _NWmH6Tu.댺B/-]O+D){uutu|%]+thUet|teKIXIZw&-К~xNWF*zuute%KtuBC vDi{uǡ+gכK"%᯿ Cah}:- tulS+,CWWӮm?]JMXOWgHWLqւ̙ ]!\)BWV䲧3+CtūǮHW vΐƂt+WʂCL_JHh˞\z@UFv0ik;O;e*ϒTw RDtt(cvHWJ!VЕҚ ̐tut QK1;Yw坙b@s+c]Jw]+DkL Pַ|M]!`:CẄ"ZzB^tu{v$akap+КWrCi[jЕخ k:DWXJBJevH!vG]!\uh_+0s+n]+,yg *BZJOە.S6X#+ /Q˖_xѝ[_QvXjkN^RW6R6ھ~ 3DP)+Pjk*ro7k#k| Z(M9$ADIBA 7*)3_@DzyÎҡ;Mwf JT9DRVLg L7!Zz!Е&q!¥+thh;]!^]%]ʘR!;wF]Ze[?;(u^++KCt^.]+D[=]#]qcj;<ۙ^dT^8yf '+pdqG7Wi,AӏbۖW@s)&&!Ӆ,YR1GʂWaͧq_K v.cqG`!Mwj$&?Ы]l0׷ap[ppk)Gx:sp>)) 㲑YJ㽤)\iNB)ؽ[fӟ._rr1XRGK WAO]rng4Z*zx$o1Λ+h ݻw@Y w:r;/v0pyAt`HT.zLfeC`߿eۚ6du7@p1r#x]x;Ms؜y3V1rنF#}?xG$,z!TG^UjeRR}f7ՄsI:A* 82'8K9'%g:RJS0h%7 G7}0`K(sSW"W0+>KkyT@hW\[/, S2HvzbH"Zp(_%{M\6mZ̦Q즎 աrI+ 3dmJX_^dg{`Y0 } t|{׮ x2jSC HFZi妕ntG;DxjSUƊ/obF+'ܧkʳ*~Ь&{| eFF Q>pip-(u)QHBC ;sTLLZM@4GKhx4GJFT{eMt,Dmxg QZNd\bJgF MxB@Ч*t:_[DDj 7L[+paFgNKevgPP=*(0G]c̜S9*C n )#s.O4d@JR!TQN9k0mU~챶72ԲK%ȔKRvf;{v in`yvpUV}.6V=[1p7^FnU(㯿&=2yNc&rF=ֽ2a-*7?Pb1+$)ӗҡՂc@/.Ċ^ Q'$t2I-[-KG7A N*X`2ܟe<1o2 @xMםΫl8G㉼n t"XU0 G|g .8i8<գ1t{Dv+y<ȹ_ԉ h1:)xr㧅r.(l&2ܺh&y]#rhdFC\Jv0JRzAdAUY !A5Ƅ̼i 8b zBL`8dv6(2gZ ᙢ}}c2&“}LƏoG%"!*ͫ2yZWUBr!; fJ,?72??{J V)Zӻ/׳dN1Zc}ypn x$c/s^b.y.H|/8 e.H,+@cT'´:AII8u:O2&A \ׯe PRR23C7Ԧ0:-"Q PΔH1rgW3հl 
Xukߌ&n-ٞ'2sP3PkR&("EfTb^5%1Bp/=2PB>`%ZʫL]:PGV u;F>3! ?ȏ&a:{?ᶒ:F <E}Gog.s,k 9-U0^k#nr P\_y6'3~ZmYn1تgu^AX=x ك*DŽbSvkP\rXF( C$E⯫ہbzP<8KP6:onRU<,7Q]ܻ:m_5vY[qHX3\/s+G_ԕGrw)W:3vUvVE嚃=*w" <$zKY{V]OչAVNaVs̡[jm2! (]Kһ9)tSf#>wJPXYռ-|=jmtG}볻m恐Tkjќ11uE{Z(S/%CO׿NB$ѷ;`̭#mE8="mX~26ӍrtL\\rxRIਸ਼T LpĶ'wpc6q>NrJeu>1iB2&z%ld$H}n nӤ쑉lkŇO7vMFofaJў{L2,Q"48I`JXmt&Ae* F@!l8,)c$< P#@eZy~jdi?;0> C#ɽH=;R_v{fz3UKsfذN%4ϧ8UgE$E#gGƒރQ0z"BX^|1f nW.IH,:@p:EEwJ2T=_6U,X. s!El㐖' @weX˧o@Q3vFbTppvQ($1`*ps4*Au:Ym@gρc˽@J%@ 6 R)MyMytk.f[X1h k{y<TOr \.k׍.Q<čŸp aJ#E\|YL$( #,q)cva>l |X9t> -ؘ}ˈaF=#seX$N#=1jMD&j"M+Mr޵q$2o dTQMd퓇X0jqM%Ar(R=({l@f5UU_W Qj:ՈHGhx$N*I3P* >WX#vFfKgoCmj%}[\BH@.dt5LXz3EV'4J@ƅ1VS{w qa09/zIbYS&uray~*vs9;9H)\ІH' 57xI`p `]!y߄v˹Z9ܞ͗](("P+teVp:W^!?<*o z{{oܼU=L_l{|ǐoO2--/s7pjx‡aۥsmP t6D^XFkj7ɤDC%v0>sm98J&dK*z^: 4>g'Z`{iOZ`y>G mH|Cܝ.w|}lpـڱFLo>a{5M7˞qRafƥ m2pXs(t@(R`ό>&*`@6um< H7ҤZGYcB+5hy8)~zA?Hu:7>جlpc"@{[HCDYR2JYЁԔX N1ӣSANeTD(Ttm s|VJxMAwS9 oJȆ)q}!9ɉTi*c&EWTm]ꅷYo_<{'n/6y8W 6[_bm<C{j< RM$4(jmюt,`DFpNpQZ| /6I-QAhUJ E4C.ڟ\Xύq*)cP IQ9QLIExF쌜6"jrICn-@B2'xh˫rCu7e71d'<[`iLέL2O%V֘CϹT}έ0N<P9%>LȯcΪRJD.XJ]H2o\u!F0'N|Rc|b~Ofx5iڭ?mt,ey~ Lqp I:¬%Ѩs<G0`I`!a/!m=+Ti4bI(2$,1ޏv%8d-AC9FH8%@K*PKӂOecfQ^͏[Zw`!TZ(»嬃sX̻sIE% Phl`&58+$W<(x#8HDGsjsHJk=K}gRXEXMp&eu fa{ّԅ.ҙ^UrĨ:G*A\d4^|TW |}+dſ~_ ?zC?q坋HCV7OWO?N/G6pBpUInw!$֗Y1~|wHY@`KB^8VIG=Bm$*LؒIf<}PZ{NTQho4wQ>޼"H{fFʶUֳ5<<܉mP38>ʙ^t ,;2Jߥ'#!/.O[g6a{]P^Wg k2 M=0sak(E6>U \n7mj4x.ij{&)w њs8aΞx&z\?"CiMG=~ĽyPȂ3"GWv|E :حJ6"q*g"։4I! viP.>JM{|`z:`{+Pi*Q .d0%y.4Z|0"ଶq. 
pqsA!:6L;#gwsmz@Y\ݘ[gJ%ДK ݛMyR*%cMniYN}D2[c4 Ĥ <(B 4T^X>/#c 'qkF#;Զ2xddJ <(\we^i H6 AzB!J.dPF <+5CbӜ5Ea5-۰AiSD-cHhO )gV'/^ёFuOZ$晗֥%ʨ=!Zk-&jM%#>)=5 u+H;d2n &dWDN28wsREqns.H-9"h^#M-`N@ʿ+cQqNFGT+C2Q|$aGW譒Sx/x*(M9F.Z@J8j=!HDD鸚\.5 pMغXMjg҆ #O+[E=mϚfϫMY٥@ iG~Fa/&4f~Q׉s8dΊ7.sh CTzr>ph%3cĜ qPtdمHQ]V\Oy|'SG颭ލ,ʋъep(U~jz|b0Yr}Ykm0l8c#!cG_cq4-.ow*AyDx5}ؠ.|ݏ/?y77ߞys\q#0lY(ߧl52SW]cM\ﲓtu |ϫ`NfO7kȴ}[B Rl1v:~1˹Ɖ.'nQ`"*D1vxIho{ke[^>o&H*pA M3jgG)U c4c6LxXHlhKYKϟ^;pa)iJ Wĉq a8UH@1hZ-񳃘2|t:9(Qg<eGu 2?o8lg ۹¬a;Q3!Fm7H+I\ҙ}}p]8)~^ngI*w3ij'n w||l<ȕ WQ~nT9́/ @Yz hIBO-3xDv'/*l TL{eA$Vqht 6-uG9ɳiyQ7W`.I1ycyxPYyuM!rLGq)|_W;e,ˠCY Vcy8UHan{)ҷq,S3=7mݗ`@eyMЭ(ȵ1OiRM0&;kYY iћ-ҽtv҆e v7ݯEtp_v2 H.J%\AN>lnYD83jt,e9i"q{}i-? -ZV3 hi{+@B8iiʼngI <2,ҤyB(P9]Xˆ^FUJmxrXYL^jʈhA #(H8H=9q9}>ZC;ͽ<_1؇3MU3|nFeI5d% G{B`966K>FIN\<0`{*;S*f%"h@QAsgт#*$;A 0Utȴ0YπG$6Ŗ-?`+> 6EaX+:X(9Dϑ#K9=#GqtG@nHw92 U>] \ϖAC<{ "q3z< hڪb!+R!ډP\ aJku4 ehD-.Jp'Bq%%G#DJr} TTw+W Wq'W@XP ŕb ԉP\i1?dW@.cG":?,J̑K Knhh=IW.EvWoqbT8^ .( BE1qڞsSxNvJuk͜|HvFgq.O}BZ0WY~~d 'TR#Gbl&j0l2QȹcJcx&1}wwA^}\q}= 0j Ygu-8qټ8!W='gr~L.i8ܲ,D6̨P*yNR DiD Y- !HK8wͩϑIU SKڮY*'Dy?nn"ޝ, UNU3U_1Ck 3RB Πu-g@lbo6oߝl×ۍEKO2Z Y1T uu;jO1С%9*$I`uSƒuA]`xA;(%:Dz=c1KA^R#)VC(Pʍ@ؔVGEdQ0A[ &rX+/B"I!DpH|IB?07&}8O&SG l_E=A,2}E )7!Ambnb 67ؤ6!Go 7܃u D2(%2* 1JYփ \$Z#QgMac6 tHnkj[7zڸ+rwSz'_M WV>]:uL+G0O%k DE}$^$BZ2>$:0 @L`dJ2F|@I GYg%)>H)k7,R.Ȉ(I$5SDcsR"!wK (aއk JYo; Z3g{KE'wWjp$-w"W/X3mxy*$>0~u\cM* "ґPFO3Y¨<hr;(z:$kґ`meJcH!R` 0͕iM<2cۋ O?ŠlSV?(nDM˳ĚVxAo;4  ]JLe":oĬEiwg>J;Zπ}j6tЛLSgj7rýC>oX[.T53InmohU^޳'tmf[TŸ\&j\tvWջ .vi_MDmoy_\m-{^lRHg2 S}ɍa^E+VQKisO5LL֋n94ڒD __H{oЛG>2c11XB4)xak,k$共FwBV8L7;Jv][[Khx{!7`-ƒU"OqPJ2'OeK,X夥d+?ʫ_m 3BlsY04\`D$4]36uϹnY0yNYvY!< 9aZ&46B )OE A(hځV𐲐"[BZ1τ|Z7~j5&O*(q12NEɘ"T2&JL+ccǼ0oޞSH"`,6y͛ZogV2 `3/d目^Z*H?o^9I ,gܘJsETDog*Suȩo0=[lzvk6:E)׾Hi "ADkh.*t2d rIb&y$LViǬQFYJy>i kY5r6]5-8%%Jz.Rd \hp*o^5O"W"ӭ>˲Y=3r*BFkE ()$k,TIJ+x'C$A:\0;jd8{;"E RrwH184Bj\޵2?2R?_ϻz~Jǣb7cM'ͣ.kԮg-Gfv\<񤩃x98UiZ[?GשSkmgQ{3\Cfts~}w'铷7xg`= $d~yJ?{CϏSq\ݡ/'D6Ck.fAq~\|(0lL^5&uy8$Q▪l 
ٳnL`V"S2֭-7iͺUnhQV|"#SL1R`deIHp@Y.9e-U 0%$N 76[x붽}M'n71wDv[ Wsh5*菘,7*DNِ#?<`(ĠfDM~~+gCM= &2dR"Ba&jG@P:msތr1*[K•ƚ+ptk)pn]qMemkb!OmZ 맳R\v&7ĹJ"ˎZ+P]Z]);f/+Oݕc0Z-ȏLPvܚUma'_sݾ `e؍ގsNX?bh?VJ'2vǢ#ߌsZr}ivL!PfG?(- YweZuNU]z3U?oëV~ Q}N>X;z[gp5:^ 2sC1[PoUuݕR^9vzNZgZQ{| ]#pNΧ{^#>PB^v dB-3MSZU?˻+KmP [è/kNfīU RuJF9(uQnBZs$TXuMENyhj5uQYQW6 ' Z.6d կUYch'MbX mڬzka1$^.Ø\4 ZKĔaHy|4[o2{"3/ָlw<^M?A}꽉w*6W2K4N5Ri4p ~oj4~]蒆&\tSk _5rZrRE.Vi̞{b\?@;1鮖h9OM+ )B/ Ua!y0ɱLN7qf4B-6Jxeb0n__.>yW*?\~i|IxK/)B\yD;F(KGiW뒫1JJu(IB@ PSݭ :\~ ( 5Kn,KAރgwf(.Fu&>0K7]LЯ@"h;6e$h:͢~Μ>FMqMɪZ_TTTc!f&],*%Cc"(e%=,iA, `a)l˂J26f\Z&Ql7no74x/ۆ(vNtx;$\iwEn-$?:`v.(M, 6QhE __dɠNnr,Y2k?5mHM+٩NB@Pjtnx>;)l^Q08@loblb)Lί?&\BmJ1D1pr \VPG#m$K/h)u=U@`؝I3=xu˒-ʱ=ER%QbQN^#S56ٚ8 AB-AZdIc!`RO!ȩTN)M$2+|ܴ*(xaD' (㧈MӖq1f姴iG1P5ف/":)ܿ[Y+zEN?*dEFyV^ξI6F8|X",9?*"AE" d8J(h9]HL@ )8JcADMrTǁT2f))?qnoc4&P4۳kv.yvīZN˧h=Jђ;D]ސqk)|u`Q -sU.Xi^m[f..q>?ni!)T{ 'JKHEТߐ "p}XHسNC'BA?PQ41`l$@*0 !P eZuJ7g -x{6UO]QҾ˼zsu0;beqNEG{P&r՝ Pw!#g;4xT(9Mbal8>Hf ?}=<"][g$݅LGR;@͎f - T,kϣOeݔwĺy ݣXQAݏa&&={nU,VWk9jS r5JmaKNu=^, ͂l>tk7r~{~ [6(n QsUT;~h2<,9ڣ#PS/}*7W|4lqq|n_P8yI\#I_ޑ7|7buĚhjUFY!Z9+biHp- (a]2y QSz`RٮPZ>m'O=%⿧گFר̴^G6MLm%hqh9/~z&Lh& ]_J)@R  uD,RN|%̠_^Vv^}N|t] }MzW;diGA7I-}q`n i֞^dPMH=q_)8UA[6^>qHhH$L 6h ۷,v>>5v7Cw=,y*'|s쌠?*0;%sd KHWޥFfV[Oi)|lPrM"V̓+ q\1b Ÿ_9SܙwϨ nR 8s *?餵ה[Pb(Sԃqcɏ4K Pv=CŤkgD3`^L<W*D/Ѳ;E``g1mf4CP!sadbGp‰u8%rm^,% biwe pTia[9Fwl˒36Tp~s@Ai ߿-sVz2v|JjAzM 6T{bС%7 Zws?Úm@!\+'^fDzOjLq8̈́pCkN}]YUkw{n  L7VoY  9cl(.Of:Z)TZa]>#VUkK缜 ԧ Ar3;^ti>*l=4yn}5d> 4#[v)/431\)-AOrw'*_Rm ZUtߝ F4dts *ʫ;j3J%GiV.$aލو Yo;"[uj"8 JܸPe2smk{FS/op]BzJ3\;,4#mbK Kpx[iI2ؿT@j6q!u4FSw@ @bH105^k;UHϮ6\UMG#Us"LEإ4N-uMt6 lpUA\ dǴ! tB8SRZ,sXD+?%`*[7RZOJTQMT+|qLKj^]qNKpƘ ˨ LGmmjcګ+s\3y̿D@EVә%[؊?/Fp,iMN^ ~Dt2sRɑHxPfڐJ7?[󏳊9˻iq5EE_ky\ʝ~|4;PΠ/r1?EKo&*q5+|iH%gsfr-(&՛% QEh.lD?XwIT4.WwX~qѯyo~_`w:Z ]Ts`oiW/}q(97E^/z6*1 EސIqQΣMçSkG{+>/H[7.Î?a-݂I}LsAE)>kVf`gإ4ļ 9hnLjGJ)ws߼@R3}2s'j O>oD݁ǹ Gw (ֵqcP?Z(^ffynJ3>hC±h8_eVWR:ɥno0Hncm_[fR#c̑f]`Ռ 9Pzǥ9X__7F+3hc_U3cw3d~ AtLv>uj(Y]?t+>6G. 
NXn z'b;0 䠧bV+1˦H^\ Wvu';1={uX_8= bOjSS!m8;rtc(*_:"Z40ѐG;pY]y5o ?-͗YwR6L@{ 9vYjg I`ssO qP5>~ "!S@i%kZjL8jIg귚CJ1wм6ˆ4"G6Z.>X)J()CUz4Ti8* eQνP ;@fLT7OJ{jƶ *}<L2=@v^\[gT1y;iJh|*[ J^ov/P4rjm<ʼn!C VxX't.E@Q@鑽pIY??R74ϳ ˜343xPήvJ>ll3%^s"Ĵ" fN]^dh  (Cd˹|E3vdV@ %.Ev}zĄaB 4e68t{-(4c =}6%H&i4' Pb)9x|p\Ӹ{ns"a@ 36yɞH&"g!i:MڎbIbh5ͰH588|Q7Z%p`GCPOsXNGC-!_|El2n)#CdE^3oJ Æ4nFa 1i_9f/$>B/ 6%m/ҚC&r>F >PsZV5&D2C]ihN5k>SqE83d( 4).I'ѝFZ]8g%C*Ήc_gm!bGRA1K%(-Oq;_ɔh)d䪁ӂ|Md М-+'A ZrώJTQi. ʯSQ!m0tjE[Xw`~\s4w%Z Ő'uIZE0$ڸ;^_-$RWڲșgf8gܧlKMTQso7 O/)B.?[ Նd \I>Z~bd%N݉!HDk&D g҂ԬץLjRu/v65>Zi%Όj>:Y}L|;r&j9fƼcmsLYڎu9w99Ǩ= IE'Y0I0(@buMgL4/"Py8m8myeU_9|xM'1o d4>_ĬjfplfQq 8"zsfj:͎̼9Ugg~K4Hՙ\Mrp# 1dU괡A^y~|z}Ů6NfbϮ{3\YlN łOѧvgjlZuKyh5W)ʡ6}*+{d[]w:AL-#=J76׸1yި@8 ?GB߇d6*P* "  DHd*Yh0P&c0b t(2`(4A=JvTb`4NDh䟦YT0%Pـ v*MGd|4pd[&6`OWKz bdUطIHa) Ӳ؝&v{!R?vY{ T5= ԏNCBe];2u|t}0vլvɷ/("as{.sRFgAn,t .7[(u9~$Ӑ]@Vyv`gjVX,zlEzEEbw3L2! W.Z_)0c׷.5Am^IֆY3dK+E.'Dv7fMɘ=f\ɽ 0~-̜);׮+~MIN5qꪧ?R˷]md{hKQ)VR꣈$HKŰ46[DGR&qg <9f(Mi ʄp)#ØqA`Bq,T1" QBCP,pT䧱[4 ]2Ouh {-w]=TH&&Jaf1XPX?Q3 ,hZw@S$, "QAȩ XBV!p摴t<]]^Ʊk !024{X")FPʿ 9(H蚊 jKߦ_8"A.`3t/#ǔ|&\LwJkcd,Pb(`OƎv;"iX06>Hm Y !Ɉ(%P׎#maBBmg Bo%Fr#i~ K.d+'No ܂&nu徻{;g RʩX./bEFßX0q[2ؓ2Ve03 /Q"Q{dRdܲ Ԑʃ ǣS)N]ѯO3Wǿ.a2tbne NdwO9r?;p1ZS\ЬTDy6[xg`>la~YхZr.z?ǻǽ}\|!Mo<{6"Sem80R'CŎ~cuF]*//u8ϕ1A" %R,rCu^$홌xy\lq{%eTFcSͩlNړ%=Z\Й]wb1ZS k NȰL5mN|=nw)ݹ=ڴ86mN۴W$LCv@X Uh\yвimmC+O̸FƌD {CAӵCxxZyњʹ\qWv bD8(ai`3ޒ%&N7ŕ2raZ2CfE-au|c`6[?w L$#QǏSvg/@&aEM(itڻ?g}?d+=7^#=Z^'#7oy/͆#cUe$HO|]('%i2fO,6fks 4)mS&8ID^v'J@j`$66_Y((ʯ PbT2hIۮ1<; $gx(:(C1a i!KHH E6DQ(2T'0rQBuF L=23Ik{Ĭ)ܯ=q#D#qb$Hْr Gʌ(e<ZR1"XJE= #sIKi%f/$YX+Q$2"XWr ` h0A7pK@@0 QXYPc Kp2 t&]M>{x #c%ԨOLꁤ󵥔V#fN;icTYnQ4ml+kc :u8;A;YМà}VO`'lvŋ>ݛ-4IE7/&| r}~ư~nhy8\@fA׶E߻u| OO7lݪoVؖ _|om}F Ta8۞:nS'BB賺劣"qӅaHev05,@gzܼm5#K0x,K^6P ޳;pC jʓےʘӞʹaI+}]r)1=[m&>RM 7O.yv*?zb*1C{hR^A _h[|Gr0 hQu~p.8䭭#~,̰Tݰ5ʾsɟ,T=T 5H?uKɟS,lݴOl%Sy=F!kS碷;(T5zY[>S+VG 6JpvQZP^@7ABsA}M_nQUQkkќ -2)@tO'z>!mS<]kT5]oo VΝXho~J&PS|@`Ntd ,sVӵxx \& y׆ Lh4n])s<0W w!5\)s,oFM|NБBA gEN+spt 
`m9I©`!~刿B.%MCMLa<|dq7Jcn<fM⿵hYy^{l60I*z^_ FJZ(s'L60ԷKwh&RP&5vxae$fVWJqt#ji g8']T +:m]b`MK[%x,.7띄>̚ȟFC2Cs,xs= -DKjD0en\Y3ʐRq16S'FdOt*EC3t0{(Y}:U 59!so#Em@!h}V-X[~bBkGѣK~B:jy0̅(Tk; (|h5mQzdzn EO:3OGT (BYPiwhgGq^h[`IE`{ECh_D0.'E@&Fnw(]]]-p %e /O? p~)àààò^`g)Dy#0["S- m)ҁ}C :sI_w>WWlZ>st+& p<vϢ3Y펋#x- Dy\[(dѯba+e,8ǜ;)vNK" U,XYj2@2;EA9 <#gF3L*KwӁIwn*ҽy@L-Bz$rtkT ,#`3c> s c*2꟟Vž1Hz2TɈ=9}‰*sY*hS aw P&zPGEh/nE΅'Gs_Fx>Hwy3y)?z<|b;,F:_bO"ċuk4Ex.O+쯄9G<]7i4A飁Pj >c\6Tx0x7LҚk/VQ>S0(VOGI¿ޅ ĘƆ7jA^ͩ7fa\ҶY:vѠRі+m9FiT>S>O A>n\u5zd*KTpOF/퍂Fר gq{Vܲx b(XTz3-N :q:VfVHE81#p={s*.W{Z9W|s0wO\PcvI툸vIwsDJ>FS?'ךwL 4fgcuZ-53R/QQkQ3Ob4w%n4K?| bxEbADKMas-XM'zn)-]B+,@}fᓍ6q48;%ΘO"1cRc?E#xJM2th*@tlNk O,u&aIZx~>(Lr%T UL?Z;mb5o6 PJDiowS0:Cq]٢  KMHX48ղkV&Ht>ݐ&WXt*'{xnR7 Wm>%Hf@hs6./^{K¢38;{\hD*>6X ]U=q/*_\ Ps"OiC$> 0Dr$^nˆi%N%#' 32Q W5_1JӁ5Fi:᢮`zooJЎ-ޏيaV Cbr,cyP<-u}.O,3rN$:1*wӁwoJݙ-FȻt+W^rA76*m=D>aF;bj429c(Xb'S& Ng2U.3*°#ϋM=(p_]^zYr[@r2@i#0:$xf${g[.ϱPyhKBt^St8 Wma2Z|.w;O [54< L)< ń 9H8JeB!fQN9A~,ZƇNOL[TLk5OL|Db%]JL%r6&$s#6&RE0NYƽCº` Ϯ!HHSIiȂ{E # *?5N !$Hb]DwfH$dK[$/BPpnބ`>oX=Qfq{=hfQv2\S W%⫼! x\1 ^h׃\2xZk@93X؛0r F*2ϬE"6-97XA/IkI~< lx}q p--c'n-G>OlZA4UVPf\xgk*\9iӃϝ.Q%z@dr8ƛ 6Ҡ +geDrs FEM)h\vҶ|5_l_H .Ӛ9= L,\w!w nBFZXrȐGi{ѳ]θf~TfJ4}}gOgakԉ;~M}uyF?ts"8` !OY%jʠ۟ۧEoQ闢\hɇh_h>Фl~ l(㛭ѻWB)T 32GGO'o廒Ƹ7ʺs e]xFh`z{i>xz3F5̊y}Zy9CSW$L]#<ӽӜ=-ǜ^~__!Ia12np8; m6n]Ew~]q!ZQD|J~zt*n>Z32arѩ9 vq[~fݗ 3z|O5J*4Eː|LH*)hyǍ^2FN/{NgT4-9b|/zf C)-1q0# fErabp[".Z1_V9WaѦE;n`&=^9JINq9=Cn>n3`)٬,YIÑge @f@ 52$8GGI kc@ʲD\XccЉZ!)W29:_pMe$I5,LAS*V- 8!:C:+bA%D$PQ*an SģCW (KSR'Oz*UD$`m2KVh C)yIFZKJvA浰 ^S?jd<{vI$ p\ qTH5&i4gIB`:wuF#zgQ14]ZcsEIX bb c{J*w$JHXnwDŠeLB*@^hT(=$%R' 1Rnejqт誱-?C1õK{Pelr7\L @$' G"T.!Zp [zLPio?g8a:tߏ<Q}Gt;3;Cf!쀢25n}GT-cM sw $3RjJqh腔JTKDҐ$p{:1ZVY#QHV7A s!9ȚA0<wz |[c5Er dQʴMf\=lsB77q};`;oMFsp(*h^R؎Xs,Z8k,\(ܩCJΕyɩ8 2S-VqQ_w5zR$BI% KUS HX5̡;iqQhW̚Iwk% B &s/ dDR[JT`iKcyŬ"ϞC><`fkfꢧ&Wa~{)G~7W}]/,{ ߎ..qF%la&L*>|ǘ{xyN\dv8=W{gOs|y-s{w#r$ӻw,^h}/|ޣ/Kŗ+{l%#xД.`>NΕ/k~KF_ Zewoc{;. 
ێ{&|Yy~Y>S -\eCnnäzh_/l ;W`IͲ $[[S2}ߡg/.>N-$QO9U$i/ߍ$#/a IW"I$su+(YCh8ĻeEl! ^)qiD}ҧgSz! S^H8.v};q$RON^w )H.Q|r.if\=]ғո;>RU&K?d,TƓ,ODN{%KQ`cEۂu.,֡^0l{>\-#j3Rի􉪠fƈ v+}C#8lnw;joʇHԐj s.L/X0D"a-X5Td*7GP-A,OYϏ8L۟N?E$jy EI(9 @Q ȰnAz1#%lWM:S9BM*Q\R)G} ǽIBjP  1tOEQ _{-kl: XzU /'|]-(8a?pȃ#y=_-$'A4MP& (1qS^w+ |!WYιwwE_]סʵW),?z.?;G;B5촚nE֍Zlj8H`bT(7AjBpy|Ҕ;mG!DЭZ=8FZ0( #t[ 蹒@b"Y2CB'q6hK+;bCKV=bHnrUɒmm%/ԧd9eF4Q:emwYk S1aPsLxqoSH 0ܙEn!4 6GS2;A!O?w)(}N"B%+M'OF4O**W!=*o#mɒ+SN\R,W>iި8VN, G'VGoÉ {uPQݑ7ݶe_B*iTQA0MA R~L<]&kl]kNYs%.NN?W@n$#dG 'UUYNa Xm%-Zq+V8!VKX` u,DۼgN#A,83&4\j=u֦|4)p&FQ9@kB-嫢0ګ4ʓ_LFtu'( ͆qFc:VQ6=.1"oÑAE!0CqR\IE=Ѭb#ɳniU0E2HE4z&z4gs\N!8`%xu>]2fSrV?2Mƞ)\W~L7. a>r?m~6HI櫯ݪ1KXƮ W{?Z뛑\1I5 ۿ`}_Ob5W˸G''vrښ0Xhv23dǝM-YݫU ?E{'_sy E>\ +Co͈B]#)}mVpS"zD.mcrɔ#+P~b#n1M~JP$!߷zHI×8Ù!Gc awկS]%xlb}' ?Ǡ-zd"V3 (AV⭁!$В\jߩq4ԁSZύj`H*F;sXb0`@Ls0>4xa2$B+>dH Yte?t'`v.ģOjJvɊ' ~%{j`ML^Si㱉b7~tro`#eEm1c\T!{Mj(*u{ 4trP Z^Kj򘜰ӦHa'%FH<\0:.8"9Yhk Gl Lz X>aoLB+vN XԆ7Wtӯry^{we8h/0Vry|9 f, í.{>3ZKQ\'}ԊHŞ{D+<=߉s#Wʠ Ve 9ݛې=x3!|NE)2?'B0f@H#+?j#tku*lR]vJ+| ?mx>C>W5OP9i8BHJ;֔ĤyңBi ^蕓h^ȞS` (65:SB8I|$iR4`|7ϗ%r̈́(c*LCDLl 3ä#OƝKDX<#O0'SmIK#!JV~,8}G^񊆒T9¥U éJc0ճͺ* {Q?N:h,S+mmMOfRMÅ5kϓc}#Fx<.͖q_=v]Ecg&y8]e_3h3ܟ :v8y4_ûi.oya.OQ2̝;}HŹGsl8m)BR4Fo::Cp\DFoB*(NL@p{q 3 |9Ij{7 3M~:xcӊǛ8bxUK+HNP NT:#EVYmaSF=0%S/rK GPQmL5Ar֗::tӍ?~v82,%!fBcD2'ߎI]t>+VAj͍ͦwWןz<멯SdJgh331(U>Cv8Ek_#'YJm*|E d~U~(d|Ǫ.K" ڱ/3;V( =|8ݱ RXRQũz;Vy5XM@["#ϝ2F- 2x墷N @zU7Mw/7tWPpnfy=(^ÏE{*.&FJ]Zܺ_:Nr!Da̔'>c̆2acdt0"RA ^,F1`.f@s3m".V/z{gA*4scW DMs9&"8P{ {Swm,8'q0DCL10^J2F|@I 8$9ec%&9?Aȳm,^:@kIRف-LE$@Hvz#Cᢸ=~ᶺmnK0>, O!FQ \y_ 1(`1._Rr4A " 4#M/|KJц2ey%$Z$DQgMr7[䔃vJGq!* ťH KED\=Gqq!ҥ-A32_<ŀG#C ٠VV@Fi,@I!JNX2b$c3#!GAWV۔h k a{z*>.7Uaәt!3fˋ5"ӈU/XK6.ht%|=fD v P] 9`[6>+7WqG#JgVmK*ɥ h`4Z+O * zh<a6K6`se-;]Ef "d #*C}ѭhkҜRs.7qMPH7_W!}h2$ 3" H$5`kM2o6)BZe%'Vi#) Sk1&)&N+/.)1>Mc?jIgx]iB6,ƶvtUMoj>* ڜ Xƽq0e[3Ð_f3mIG%rpt ֌;dd+ߤ\wnkredD7 B-h$|QHE[9*GP`O;'qB?|t;;|Y>vqi]_Sm"#RѼNcgWTT^,嬑pX*Y80+/rM E YmwPbH2Kzt/P.n Kg,슄lZ7 \ j^`xB Fɵ jٷXzh=H:9B(աzCb_`$!~^- ̡WK)@򆙈"`0ٸd啘r K1S;\(\V_U llDS/H) 
CMɄ.l CmZP*d$K76fbIR&B. 3tT#уҒ<4A=PvT587WԬtB{p3Y3fhҏU_5~: cUt=X_Oyq s "o ?Uҏ߿A#]腙[Y"Jאs$n/T[ 荑{ N $o F3!5,Zqt{ SfVm kKwa,p1jaU[̳fn2J2[1ivҴ=DBNM~{ .݂.mǡ{ ƝGEZ}ʪmNFԎ&]hнXtsl9-[cb}}WJR! x|οbg^Xf5ׂm^-4ln% Qؽwљe0,~CwPp#7E JcOL~g"W4 n%}o7+t=f8~2ٞ~kxy;~uoA_x O6(؞Z鑳4e&{*> pY䨏y?=v>RY?Jgė諒-^Եl0_][s"G+zנ_&B>΄c?;u!@ HYP@4ice+2a4RV c-] oip闒0Z&(, $SQY Npp!Ңn^KRJms-< ʕ%ȭܭ8uea!Nќ.矷7p>~4=U҂nfwYGbNsˁ[I[8_˖BaڰM6N&!̵%]p]Gɩ)k~_ pb'=n=Z( Ĉ\p} \ia@WsCpx9$s)}ͮ[|wtn9:T(!tPgӇ{>4޼SCdi#ƍ}?q?~|,tbS}(n$Xv! L0 Rl>^Gn_5X~IY2=K&h=jX91iC+bd[sd%_vV={xBli9RM۱7eO#'8)u>3Ϟ8K~tvm#vHyzPA3yr&9ٽe=''5EΏ9W]&aFW Βɛ͖UZpl9@n>?h^Imr69tI!V6?k6^Aژ7|Nn~].y9u)@P 3Ag; *Ƚav]e J{R!0c+뀽h @w]6hOo}` Tڧow6K-m1'AVy)¸.5B{Ap}WgwO60+n-hT^i;~Vn.NzvnMO1kr;' jJ[[9yݦ-f&AНJ\i =Kk=V |/U]RTLW[9y]sE 33TC)<9k-,YAp I6,ֶQŵ^WŻ'%6ݗưiY+R5Yx9^-D{@%WU?;AB"?lbގz t4y3 hj591C>̔'բ5@ ?NepWϏo "5 OzA߅ǮC7 l (N_'y/n(x E#b0S1FcPxID~Ny;;/W1PCOMFTV۩M2XfYJl!C_QPE1䣱)8e/d|ݫoxzӟ|~^]x8E9Q舒 *DIGs#.x$8:tQ?} 2tۏChta>4#<_o:/e'A~W~Ε\ɿ%9;Ӑ9KR2;<; ~f٭Oł/a Ae" xl#7wF?t:HҶK], ')#*Cn/~}AzN~J+1TO#Ij -/Pɞrĩ8ٞeMߞCLݡq0cNѬ"mx͚KuW?dqv&2ۡ*+ë_tTYzD Ydt4%ܣl40(3$b/ΏX4=Sɢu;/{H EO1H!xPX ^Sd"BLryBUf&FH}a6R." v-ym8y*ඒt༥)cq\#ErR dRPXP-Nx υJNp(ea<(#$}g !GChh!' @L*k %Hyʔѳ. !*cfiLa  RhBSg0GK&CjUBzHl!d@ "K9CND` [FC y*$ܩ"Jyv(jqwkF Bl c.e iJWם'7?!"ͯd?ٻ_vOn0_v[ܝ!bo73>Xf뇻OnmK2`1.`(Pgb3 >}$A*M|]yhA0f V'3flIPA5M!܌rrZ[ʐ,:Y##$C8CAHa2A+LպԌje&KNCDWB M f22q֔Zj{^RS.MtJCإ QiL)Np4$Gq q P'Y 5<JH17y@T[$rLD-Q({MPBiĔ{Ђ*{Z>.a ?&{%D&@37eihbg7`Yd]jUPjcH(|n = yrK ݗJ!@vE$a P\ uʨ 1A0(<9G,.zfËƬgWXy,oIT @zR $yWWWVvo ہqH5y1-0ބr$X#EAA 95ེ)5oT!tbؙK5{޵ۻ=lxtl^ܹE\#Sб79~[tNuݛvo~8 Mp;^:nn?!di 5.Bq9d;NPUi|e@Ҽ `bjڒ\EEs䮟N`޻=tV-m|vN[&5hss,RTW7̤T k1yּ꓅m=YuY6ɠamX淾]xOeȳH$F9QXV#Ey}-DϨ2T".J EHA-V y:!6j#{B \B:<[ -~EQ+x+yڢk@:ŠEDM1)0.JkC8)4Gyq׋_n p7OYn`̛wkbbb3>tB 3JE4qg$0>h6R'&D u`֣'GxͶ(BcYn@m[}Vp+=JAXʦS&\YMGo]`F3#&pNpkG|]XAYO`$s<__4G%ABX\o_ޜD.X\Tim֞6Kt+e §TO4O_BnBe75|1igAY=< V]E2FLW+㺎J=L }eC -i<'cr+a¨~DrrӉ}L"uK!1Eh95W;&OKꏩT% θvhթ}WmUoȂ^-q)aG d)3imVj:\NR62;QoofmlIۘ_ [3!5m'Z)mz;X_h(-a"gO1[],2iW aii4, "1j/ax?l"sKS_(Dfqӣ. 
N'Қ]lqP>uQIQZ#3N%;~5;XDޤQRkTsNQLtw^PNr[n1οf_/_mxoqWO:ae3$7x^~ڻ1Lv~хi&Ni&Χp][o\7+_vnEY4$; <ĝHr"}[ţӧV:"u%&KI5R2m!kRAdLJ*@ö"T41ҡQVC-oo2>:ћg7{>wtقvγ۷3i&nOh/퓴8KX8-UA:եF+ˌ)<'fE͊q7늻.gDG_ ѩI$]tR$/NVXqRn^!K\i+$P w`OI{,AcKm=a*uLMۮMVYT5j*;l04\Nn7sS$40s J aW׊~k`eևc/IъUZh P80vf<VIm($f8ɋ1#3cL~@~̈dy^d{bI]14yq 1%k5fJW+omBkr"'XQ+} S磜|}=n&7w e/4KCV)@0/cA6f2Ag&~pPZːiѧ΋_>AdJ]q."K6Xu K;dVÅ#lՏ/@@C󯿙WeH6'N0Q!ͫ9>늱wx?@G> ].|,؛CU\R>,~mcZ:0#akn3ǂ=)8,EbأA"u:1]augA4q`=wɳ6W!fY^x(5r?&+7ܿ z?. hMC8^+ubcf7Dq<#Jx gQhO230ᬆWMc JNrHNdAvO3uLwp8oq,*vA۸ y( R&Q`_: =Yuy)>=X~⫫@D6IXiՄ Q6673bJ27لIބe{Yu? Z`ccQdy!j7`[كC]4Mrڇs~8 ZZimF;"ЧԚV+S|`W:%(K@\z s1 v at= !Ĕ4|RwZ=*a3k#'h wOa j YZE@R!JّT(U4igC =7ynGp=<Ңks:ZE&UPD-)f{p/ؿ胱ӝ`ľa'l[SLuO[NkÊ"*tBZ#H B=xNb&Q 61(H$kl!Ř3UI$6e@ kD*+ RјNȘ(wTUb$v j H֦uai]ZCSlE, |t0r tK$i aߦbbْ.kfG'˨v": kNן=(cP޿'}GVmZuK:8IMmZKGamC6XYM룩Rs`MkJ{,zi]rE 6xN<d6.SU3;G2&Vz~#sE{K9,{뙌⥹-\fdṔ}e="69YBRԛ6DPF5.C2))tts45 Ge'uKK| /Bm_I*TX"g &blrA "G.[35KVVoS>o 1#I~5xϼ` %۶ ]Y&ellu띉[]lFДRV.Y'ǷydQEfV-f2:j|[-_>h7m%N#Ԫ1ƟN⳷|L򙴊VR`XSv z`[4M ߭c@fVZ<k5Z4hTbY׫(tRSMBeT#%bQXp5 -N n]Jf5,S!x!%?+S"u`rd^Z6l ZLզ`y#e=)ǡ1ķ>{/Dvidlbt&Mf-F!<` f1L*h,QnVRT@5la( %* 9(93c, HQ(D +6vbv]JIHMfT&!uF@̜=BJso!r"w%$2+E]U/SY`af~!f^2&C&7^e -Xke,/RҠI1MVy $yge+nmiS3 rD>+[ LBydHhJ03Щ?-LJdy~8?|ۼmJǻ$IlecNMU{ nKErM H%,M]vMb唭W'~v˘|\M`Sik ޝoxQ7mDڗo?}9EQwA.T Ӓ4 ;EdqS{VwL1}oʫw{R1|U XfM·ӷ|Z!M} '[Tnᠱo__]HeA }Mr3j%Oث/[174L08Z7փ /Ɨs 5Su.N"T@wڟÞ]/Nz截\ . 
P KJl qppp˝+ $C#V5lkTn., (H$-[J8Ѱ'4;;rs9XU9c|t%zZ ^Xl .)1)gFKS(yhS^i!8?m> dˎuc}.@7kZ3ԷJRPѡ!>uyqrN_C4V:V"B}d*>D$#9Zl@1ٱol(l*u8r\U;i!_}ׯGL%fYs C;OŸ^ QӅuޞS)o0 [Kr1 yLS`|EL8t YkЁ:?>]@/iPcҮ K4䭗U(!7I|Oǿrqʝj~^\ikwq|rbߵ `۩F<^4&XWxhڝP0+BaV¬+4+:DB!PmX: S֨_I:6tAѴ?CG-y5 T(j@CQ܈ݽ!z `BoHg@qþT "M Ӥ,}6މD;Ӝ\n] >+O7믮uDFz1RT,lh\Vg\^ӱw_g *l1_=;ҹ\|WĴu 慣S",ߧ_'mB ]yBL(C"yzi,[4MQ7D,B0AI)G7YzDXkL)ľ# ${$b#%&@YSSTUu"ϱpj=[^c(+b̬?@5Ut~yXw1VKg^=]1{mV .;K13.l Ne"1z P_ٟYן3Tռ6'M7o^zDgEoYpDVla))y| $~{O/u߈;xW|?{Ǝ/4h\姳N*ld_J5ٲeIG}V{!%F!f5V>9h4_Ktp̊N M]L:^猈QB'@1BR)&*D!c )!3ڣjDjb$2LbRƂGc0$:X!vZs/ MFc) jȇδel7Ex W:74f_p3߿/=:3E+jFr}6,13n@~y" 01- eH(wJ4Ϟv*4M30ϽC  y t9ӼEdb}y8pA=a[K;S.2rҦV"+k"+Tnu+~En!ʾ?mLyXVO8N-7㹟͡>'Www1\^?= $M+ޟdiO)զT}L*TDF6Ⅶƒk^YʞfsblRIպRaaڸ>91`o."kdMc,03|*PW\]09@v;2]jm@n|gL7Ax3 }dZTU-cNHczVҨv#pOs\|Y,4;]CAXF8첾a3.&[K5s|s!t\2ڇl2΅ O-awOZ<1ozCKD~lfP|{d_swf4+R<q:VWOO +9ڈVOjbcA~d\{l%Wm%fӸ>˛kZҸ1[DRCEM:YE  h.!s/ȜQ|\XCg{WM=Uhc~8om|jO)`~0  F?n"ÂDZH4!nCKM9jHAN:)u#@ʱbɨیSAoo.K7p'+qGwJvOke\?_?om{uwV.F`^bXQIljE| !Buamn5̴4XtLZ:K AJmqmU&Gql<UhS \ FbA ZJ9hJ1E0(* 8FD{W[F-]q(bwMgG=kIۙVQQ+Yp 5QD*\s.KAz% BnZdZ,n1N<=rodVnj2S2YOlji}@-P` !ל#7:bgk9S륂v+Ԋ555gq*9q}iR}9*Y:r`"F,ɪ˧O\h*$yJU/}(u)cH5{"Uyj?û}LAЗ{$)xaMn )^(=i\N_NW֡&\x8UbL3r3?b1舌r|#ɘzZ՝ڊ7'KgoҕZ!k4Li^pFȼ[Lq .^!z=2~N](bȺU2H%,(mW/5з@":we9-oR~ )U5Z's=R]P N'E5PeML ;,VJwt{+NW(?oLVaq lmesЏ~\ 竫x fZAoI1a8~H0<-) d:ˎ%%0K66VK $\5ϵȠTe1r⊢z=1Ȥ1LǙ~h+1x<,YdgmnC (4JgSZZYAP)FTٔ-V~h/9M<_9gjktv˝rPZZ"k҅1̱ }ZTn4s &jiM]"#ч,eC;K$4Reiob,.5״Ү"ԙ, tqbJhXW袦4>?};`1n=SIu1":J]tQ_}׻tchouE8}}8yC/ԏ{EgOzQoN(ͧ܏6OҔdf\q)oɩv_%tJ1d i7YREs*8h&ϕ,"خMFNzu$u \[Zg˿WmMI|1"J]ker^DVFtIrIYGBg;^':H&rqe6~=S(Vb [J&YuW[}7!>ub"Ȅ|מ~`<?b1ezn10$[*ъn( sz(K0E]aqjj2(KbϾYcⰊu8PhODYhzmʖ(6VB0JW`xQ9$wZwŎ=Y+:n,z%ad$Q#) ,Xzl; pY( dN`c#|FS.]_ki;^ QށTZq̜rЄXv=DSخ_j abbNg. 
^F'\2'zd |L{j/BnGC7By0̛r9.67oςuԶĚ ʈF RjYxNtN zEΧ ôd Gu6N΢%1⿩c3-y>W\5qʂ:nd8}5IP7:YXW0|;ޕ#E+XϗNoct2gi_#˝oǓ,|$rWUXY(UH4gCFTd02Q$Sj }^K)@8$5p0ڐ Rǯssh?>38}##T_HvxzA^^}Wl0",|wXmf4;zCPdGɦY9g8Di/:Ex!>+3P^ўˉ%M;bqv)WⴃZ<mO9h9WK(z oCLthuoJ*IG.=&WrO`89?q;%L6)k@lbD4MtOH = K q0\}DmqOU}|B)֣\(FԅRP)5= )SYՏU}[4Z "ԊiM\0YꝐsjDTKq탪KsVlXlmVrOj+j(sT׆+\f;2m'>jmfC26~ƋLvv8E ;:rT(v}]xDjMGűD"ϵg$`C]Uun2;П\!3ː}# ْD|:&3 Uɛ>7/[C!lQ>k'Y*W^P0qwfSEMHwҖ-yfKa-iˆU#Ƕ1F}ȩmpJk9F3{R\>*NdqmUgYu5j8q{]ϊ8\p(cQ {OPQqiMUA3+wV:UgEjTuV˵X\%J$_ rkWhEJ|2g>Q uV(b0ށ YpXH^xp|_, DQ8"[؈a(eʸ0&#kyG A?C9s[TpȐA sm+VUe,9x%MA$h$\>$(CX PjTݺuqs(10a4@ݏzs: čfp/~+AI yɼu`gs`TBkf7q7VFNĞ fj Q@G)nAszx2|=Y=d$;aƊ=EdO=*1op[ɤ]4l2 9q˙]mtFwGn.ULIY4XO+x8S52Ua}K!(TSrh/Q51AMFZYԌk8DFCyrq=s?}PŃjjg Oh럝eڎQqAdpfg}k>e$^p֜T[h?\\PyZ|uj冝?y1|*bdٚU,UB(Gs΀ZMEAن2ȥdjѱ_*ƍdÉF& *"TJ\Puq**Юy/`aAZEEc:r0xAE?c.N2.k((<ՑB _ҝEvrOQnQxn³e:h:mMqk=ld+qoy.p.|(/9#$<`UR:Abe< d9KP+!2q@KhuvOºBH/<ALp/Dt\QtJ/dj9+,5Ƽ2s1r%S'- R bZhBC0 Ўi:3vj!ec}.քpTC6z{ͯ2Rq`g{0gW8I0)3<馊Hn7#bHƨ2MkީuҦ且 ɖy+&XޘʐL&oZRBa cțB{ jn]]5B#YS!c8V0g: \&) EB$Y3E ,='풳 3BHqu{lW# \yd%索XlXa%Yhڈhaz68֍rCU9ǭQݓLHtoQC9.rF9ddQt-PFt+oݺ~Б`Ϡ9wTrБ(3q"ȚZQrzy9˗S١T\ >~|ǯu|v"0[%Atb6}SppCbwE6o$`bRK qS`jC$1*ьwьyҪ1uAڢ.#̈́)ТŵtV%= )([U2/F I}ˣ_, CYNލRN1+MJ=FE2;'I.\4c 50Q)"ak{/c%V-ߢd:KA^1g3^<(ae{ǦnL[^?FDj*wu<.\}5{9n0zq}Quַ݊u:p)ơ|q7.TԴ '(X[~>FP$ AE4%sx|!ן?2![NtyjN=ʵnfv?/DA] =َ[o6, ͕K>4Q!u<m,uCiJ([*4)BjP!j:)(gxJw}7葵m&=zG;JLy_w|4j#v?-{wy#r܅Bˆpq^OwQӗ+'M'*J旋pA U?9@U[#o4,:9⇀x3~ΦG|Ź7a42 +( ޟ|\_^ɟ~g7 JP8k;gj2=*[(Z=.1 q<%V ILJM||ogOhV  &{=x %bssV3};0'8PO({O,2K@Lhyb*w0G(4qR;AUk(1b]J6K<:U.o@=8;ݖxp0gT0IHXu$Q@=Dxy(b2_*}%n[IhdW08D,sbO>[(=jK!|nv< PGExU#,Ώ.D׮*9UTrriU6M$cJoa,}H %U+)vK56U(FE=\!xtZeMVa(5x#czý] 1}hf~q9~PTmjgW59;-# g#FZU߀P,`rUx!gF~RS910)XKw@߷1#4f$2^AM>O'onz\Mb9YD@ktž j֯7`:A tfڗkku魴֠="! 
zSrO}QTĒ&Z}svzW ]zvnczB)?'{yR*9zYs<>yp&^.O5zlps R5^[V %@< drIe.{"i:=}J];y ni]P2O+$:VJs7=^9$R/Hw{˝ɮbI7<Ԥ7wMoLdbHͧݛO+=Y2)5)|Qr7<֞-z»je Gb]|;I &%pޚǘ ݺ[Ik[8r BqD॔iç{ZZNT>Rr@9v]BX_.i{s!T;ެbʶ-mWEA%Bѥgm-m/7 `qI-P)U۾5[ }~ewW^5W̠ӝm=o}<ݵL/ suY%|Cɫ 'D#;u͎wHМo_7B=%Z"m_EF#bY<ٻH4Mej+0rM)2513dVOɢiٺv#@OH:QwïL5` ; =6 ܒ{>@:ПΆBϪ3OK |fʫY?yݕk/o_ϸW?z~uUM~z\QijG;d8 |Xjj'wy}w.ʿk$jlѧ$R_Z,Rd, @nց %D6yJ^|hJnQa5co~OiHU&iڭÞO?M 7N' `Ѷ#ESb"V!Q[f0ygΊ睵=`cElQ8>&Nrk5Af0A C_ z*~q?ӛ) [l {cq >_]f^ҽʯ9ߓ*dvtHf_|!k-Rye'6BΣV_j Yz`g$"o"L /ol]r{d{AֶcSPlek~y/oOjcvOuߠ3 4)uysF$QWf 8bs<5s'I:֟#z>S#u&:֟݊E0 ]F?r~ XF'5ONƆY,vf]̀b8]IR4^~5xhVAKdictF%2Dt"hDo 2f^R-qWxدuʜtᡅִq/IQ_4&!5`R0*?[˦6rVW0@j))>2oAxo~,by?(> 6 }$SjKT>$Z .;uHp\oY:[67+M ?\DWX>^p}n%}U,{˩TAeUU޽.Amm/Cmaϗx؛Ԙ]G9S33iiMrם/dCefQk6cQև4*Fd8tb;[gg`PMKKzQ`أ]svꦪ4W>&XH24,QP.FfjG?x9Gͻi,?XaKLOBVj-u'|AAR 9[kQ6_~x)@3E%~e˚2|Z|e 4 pn&2h5lHlAe|K 5iml!&yCԔ8"<~Ihxef.@<,%1=mo(+ eL}C? kclxVVȫļs_X26:J%F\ɂFR㐜2iANYIL @Q5ډ4zອHC-5sƨ]UE|(zTiUol2 +`-5JwM(ʾS[Q(8g n>N3 Z6WƝkL.WELeɥ5s52JVdi M֒ӊ<D+d}K*FII7GHVyǡӈG'KpOnpwom{1bI*uR˼=G=#]{=,.RnÂCEl &-zg+UgGcLfdîy2k׬չY*;xmë/d*λS\ӎ SVu 6$!"=E܃YEdSS~U);AL^Qq.4xyz=?::Y$M08+ 2 à]#R$nq?olomХT0P)88S]#1Ej$P\0Cek\\J Z{ ysM)N,Y)XV`1>A o#!8[_-Λ-  +9K2 9D2ĉ|i0a9 >6C1AƄ^)%4Ĩ/G`ĻVyu^6V$4hTд@4 b-G=)u,VM'\([@"uʫhZZr>rGi"ښI6Vo9^ g~t;=շbtrY)1b=UЉThJd[/)w*$*B~~(|+_r n[W!qM nrEn{hk@0J?}qS݉ufP3BK d?&hY&y~y zo/q:Lud*ka,\#>b":ڡ$*3i7NjAOi>]"?6P`ƶ/u@F2ʴՑVyM^ְrAA e_x"p=ak'tsm4Kc^ha鸷 Wa`4=C*n\j!'A.xisMTEp8I&?0q*O#z3U;3ŹZ=O[ZOF!%2bX=lpq{`h!:(X$ZihWFu[ ҫ/1˜=ޱTg兛%?M*CxњKW6բP EsuM_`Iȯfg-%濲饟=cIL~vЪ?XrIN ^6(nz{Y}DʮRmSqNN5rxNсmSJ?Wp)gv㸕/Yyz_vlBM:-Y(9Y?ѭ5ی9@ۭXC^10'ޫ++wV_ŏ뫿/}6ӇW狋Mk4QT1Vr"O>HMQ$8:Ml 71P ^]E/{\ IK&T@!w*1I:-xg]5ؾCȨPڤ awCи{YN*WJ_VJ/@XןX,'*wWJ6gn>]X JWe>fҎ!?LudcM]4BK]o3S3QK~ Dc:|)Y_}`Y[H06{拈70dP{݋#~fWGgr>q7UsSn?=h*nvR믴g᭟i^GQ(5=G {y\`~QM~ՍF#~qDܙ Ɵ!8]IF-͡nu$@zy÷54$8Z zZ8ձm"bQ3h:<0ITYh:|Ը˅S~|05n ?rj!P VIR4^Z1ke#M"ZLm|TB+'`Ԩ=UP' ɵDi7Ug 8X?A2Mp65P;iH8Z#PO:\--o(\Kp&Cqշ4>lI9nlX~fFa%3P,m׉L bC0e;ҩ<4[ALAqHG@Zuyw 
nY.mOY`~р{kNF./]]|9_B"2ث_dq:O~8tڋ\!Ex`b}6&F\R/kz)TZsTSl\E:Zy*kef^$J>*UW3Ndy*1Y?5WAõi-,w7^ PMlvNgbʙ8J0Gߌi3 aoIVNzvI#lycRCI::8+G[MF(Aѐj*~1v)FNbꦜӔ"60y쪭hʡu#)G-V*:^ܒmQ<)%uR<)G?v;TkiIm z=3Vki96_YX..?? ݃'ܳ2q\˻ӛ?iUԝ}?H+& >?h9ddɿ5i|+w?d^̀572~g~Am/k,_g>Y{A:XMC)JFZ/2h>9El&36?^ɇuc ~ޟ}f׫|k|up{tB9g@SlT'OJޭpTɣ-3Qu8e; [AZZ?b 'Zj@khLcM1/Z[04{^O[ZD^8$?w2 kWh7 % əmkrya$Z:PKl..`K`_s)V6#mw:6{Q4/?g5V8=>ܳago| 4 衱:LnB/e bKm4˛*lik 5C?$ FJΧ8i:جAYMG`3jiRJ<6~CV?ï׷ɴ~X=yH[l˂U9UqDz-Z ZEti0G8kc$KFvGrT7N%*=H X (2#/lc<&Vh[^D[izq7;UA9GY厽tخ8e \nTl%5U oq|BG9NV:x:yf5:eqCD!Bl*7X~qs܃Hi띤=7x ^j#=t³^묝(< UWIEt|RlT|\& Zn\hR$H1KAtfS*\!.w 2g$8eaɌKVcRkW>]\qk_me즀=Mh t53b;#fFa%ӽr(<%%Q7)sK0Y23 ,hN=Ltsl?)c*/!NʘH8sR C2e$nwa 5ԃx#gi@qvuyFX-YOVK(팹vi ٖp4ӛ,KYγ,g]NoN̹5 [V+kG)qaOi6x<҉&aL6Zӛ~hߐjiF9rzϊ{*)J,̻Ḥ?^ O,*Pq4^o tCr1Sv CkwfUF2Tٵ{SPSIe-뱝R,K5sh@;Vk /YPόdX:RDU [ḥw`M&YIGKT i<규ݐ147o]5:l38GAKʹцp?^ެ> 0'E]~_=&ƽZVYe䄔=.|cOǎT)iVHv(oxJh8|#mᨳ|h ss.S&p/gY.ϲ\e<哽PI{+YvIeJR򸄶imHMj$Yh-)ȠB аP?!P%DN?#OMUi 7a(;gLWA֊(KmT"D9G&żJHZTH21ӓmz1!Tdh7OT9ـ)%׆2@MEHC kEn9:|mQEh6ʿQPj1Z6Nug!>LeIaN묑<|W_^8ǀI46D0!مV/qV/_+&擘˟sOƺlL-td G E|zt; +B)ƺL%hե-81i7N<`*/揚@BWݺG<` 46s4QoFRCݭ̌f4  g@H;q Λa0an va Rh۝sВ=H9e*:\|CᩰnVKsԭ11]\vβz8,zVVi<4A gIyR;EYk2rmU xoYZ;!&>W\%WC UpUCBk/"-ָ5A(\ٗBeQ jE % f 5M` J*3MsԚI`$T3G4" t6'PSlѲG6Sz7|01ŕi$v:9bt`Jc8|д!}ۢ ֙Rm!bHPC|^K]/IYTvLh*;S`^-,.WFr;t ZֳNu۾wzYį>@8v͒c~CHlLH%_Ji(H::QfwEvһޕMw[޷n =wU,|,uVocaB%ةohkxӳ`KPS1Goxc~m"yOn{dI Wg9&%Ko`R(]^f'#5mxUi{-5ZɌYe޶ȋ_/o:9cCʴ9)GcOwN!HuY%J//N9`/X,Rˉ`:FU̾:o'֜oV C@lAɢdSvvuduWt\Qs0nǢ+;+;*?Z #0r:,L޺k1~ګ?V7a1 4`N\XڀTc;ܜ@y$-|.vr "ֳ1a$Z]84.K>RދOWnIC7p)9/{*dSrNl2v,9j%yuZcm.nHH ,ՏRMϼͲ:S1k=a:TyB(!bC)9/7;g̟ǜHʞUO]q>%v\jROd?H9.zX 3IQg BGJ"iƘ4_+E6 "Yb' OTB"RA*I', D[H [ճZuo[3@FzH?k?>SAS?mP7Cz{ 2U d&Jk.I,F1Y&{d[]9y+V ԀaUM|~礼~cgj7rMWVcbpʴƮtԮ∧7k#bz7׉:3yQ*;PZ2QD^ZGn;;RpJ8XkST:' Q6jlڄBQXc<&O"09<* :&gJ7`O]_2w+|8oGnl5Z!t9{$MD*VS^ѪjJХҶ bx/liB L2 /~XKUkT4VAt`t$6$"e+] 8oɦ2FըjjL:\U&nßW_Xj,>R/ǥjz?LAh0Bko~~ qDnx7U}??w(LL.K?K^,jxss}ߙZg߾N)TEdō{1WN xqS2Lċn򏏷JݯnȍY3A\п ]Dhj` 1A9S\U>US\U>UݧX; @-&XP4-ifþׄ>糠/iBSs?{tL̴|zp 
wLծƱ*밗kK}ןݢf}}Da=:0yA]!$U[o 0DE))MQFUXpwUѻ RYUZ==omsWZZu|ΪVKk%ӎLiKK#uhkFԻƙu],-b UuhHPul:yԼS\YM#ul[eFpBDDF*8:gR4Tv GPpQvDx*a0:7aVndj<;"<_]F3D)t .]!A!e$!F8j+]Iio!SZ\'I d`c$-V6\-5  @Nbvb=͋gӬsathd;*$֟ǟ#l)tlIaUu1 }f(8|1 Z:{ҩ; 6COiuZ6Z9Ԙ)N!U=FYgsrsnsjRVPMǻOsSm/kמPz @ہbof >ƏUݝo+yN|?E/Y# ;;% o$.i> l *1o?c|;(RRMߎ [r1 ;H_-}E"b[ZXwaP+Y2>YgAj/$,=P,xkqa2m-Ѧ,벐cX1İKÛoьGZ? ;LZ.Nj Mv ~ &^z̀AjƘaP{O-+ d6: Em%7cvt%\QyM jF4ĕ'% :c'[~.Q@ ꚩq &-p~Hu.S,?sϞo@guQcnsx;u³~yw Za׸|sKt n>)#3kcd-c &l'U yUMȫjB^U>!׺IҢ&|1&WTUc }D=%(Oe#뵺z8Ϳ9觬$-:ҟ@I)IyQTIQR 㩣Z{w,lJGɖ5e m t UDixr14i9x|/&?/x;;߽;y*6v$Ínt$H8mnԺjӴu$&56VF=ض gofHr'6vZUMAK)HwsMI!Faϓii40 Wƈm(p!2J !±cT !S V PbMbO?_0W$WHH4eλ^p" =N4d܆}_e)kѢ11W2{Y?38c͖ M% :-Ԉ-% nR&)8/勒!^QPt+}7~JDkCOoxlcB 7A g 龜p.ѯ#lB Q4tbô9XIȱY-}Qh]`m~BF[36qOrޜy\Cէ@5ZY|e\"5 )M 䭖g )zQ[~\7=8yX( %2c_!qK%RLokX+4`Gs/=Tn@ƊQ\i~A rc/F^Pe+^SXi[;r%mgaDe2ڲ'y'8~֒e:ot֎=Ng8{Vo.) g(2VSǘ:g+[z߾eLf"h z֞y{$YYkN],:P)E2J va|9 Zm\ߢuo ?̻ U}$ag#Ar(Mb(1=(K59F_’" Raq3A bSa4}jſf(o} %+ IQaEW! JkM-ᵵ>{;P "@9f˴DX t<)"BA6OWO8#pRL5T q`,EnF~ɳM`D\FncjbfS;,݌ϷشM \cs܇X8$<}>r<&&ڌW5TΗrm0>͂ \wm~ٳi `^fvq${ξ$Iks"}-Yn[v_,ٝ$bGȺտon`_*_vfHLG!4nj>ywvsU5{ WV-XR,j˰RaJ  c,j*EV[:}?Ac=~%ozo.9^CǏoE8~Q3& u``iŧykQ ~bWˑ|8&p? "ūomy#D}3UK?]şd>^?>) ۩cKoQ2{{qQGs}/1Ele}*J~z '^ LK\Cx泺4ŋDӳq2`@l'fԈH? 
4c/<_M׳m߃UF6 =I`+BT6v[ޓ!L:\8g7^UAoai )2Urq||uOF`Ap{O?Y(5"en9&d7A5Y$X$E} 3Hd/"5[IʗJnOHcΉL+ߓhȃWtע~*p4Oa 7 .\,: ANLLj6)V`+˞#\D*@c2rb;yg68ya+[I#lhR/M#T518)ɸޣI#|TB6-$۲-jbF;3S9 !T 1H˘TO5(G #&2Qi5Ϲz+9+-'~'c`B)*x[((J[JTK%:WR[I[ZX!'٧25ll?+Me i<2+X+H19@z&#QJ Ē=[=;[zDrB5#F٦$8=_^v뱖#Dnt' ?3XfZLX"i;<pL^eqN9,H4/6@V7[VvrrK+JQIx^XYQcweCl̕r`D3t^VYy3aK=^R;L߻ݖ 9fDV>Yʋo1gIxmorG15ݴFp35NҒ 5T@m<Ɇ0Y`23)1AaD}QAugw3pGH1І+]''K %B uŇΖ>+^ư޿dbRE$9@ B@ϐRƒsg m!H $p%JYeeQڦGyo6PKXwY@rHaY{2Ѵ#, ڣM*j1 8j %Z@V4jYT?FZ@5YΨ9,٨E=wpU{tl' }`Z]Kr|j/Ңv sI=LPce~Ma#Wc&Y m?_ݽ5hY(`[=xHu30=M#]]\-yw&Fc*,aPi: %{h{{ލXci>g{&3V˜%Lg ؃.z^%AƟ.8|}{fi;cG,`yƔEҭf̫KiX|ٻM+Pقg}==PLF+pHnJPKMU[Z4(l:vxbMRo:NM2I`/VيJ5"ꩊ*5"y|W;^/2^LkNF$ WH]~j /ϝҀj6h9)-X(鰰T#އOnLkΖם1^^kfNƠ dICDC̍A+qHJc56LKPH]Eѐ#OUXb1{K<4:KzHTQ*EWBɲtXeK`Z,$W@;7H)+GGxChXf 31k$=ݿ b^Ԩ e,R"ʗe\Pyǵrӛj:/1eY*?$}8TZfA9W8( A.]tԲ B`/BIl[aaƢPOUho6hÅ83̜JSwx9 MNV*2ZJN ɸ(!e1F(׉2pKn<ø^Ԉ҆zky%S~zH3j'NVW:ů˿{eQx+~w:djEaSɹZ N\d&ZjkqGP34IJ&] |bICgp!B%2eǩpw" '`wPv0ǯvFJ=zy ?ӗង+!2^yk;z&qdbqWg?]?]~g'=[슞–LVW'Ggo A_P_te}|/ݸ *=3 Bʡ};NjfhXl{ybtn^=6OPֆ~֟OolUtRC\d?:ȅ~W[~_AV Xg,|rPp }OE@r7(t% ӮM_ֽ>Y!wo2.T+%P1 XY 8`Źu򞛾v<{h8վ vD¡ި4LTto#ޅ4+ ؆"}O7dҝdhn*#K:kʦrKt t睲 u%RG|9qܨqBr[?\>>Ewo;gr:Χ5d?^-|%>J*]^́k @'+ FQѦ4LKsWZeT01uX6uXcbG fmv\ y1C 9  .aNknQ?^Q)k7Qۗw@6w~v^==:ї{p FfزKj/Uughzo?fM<5AwQudOQ6٨d~6S Rg:t Z̑pbQ[ U ^ Pcix~Irc6JN97;^e鍬圑(츞 =tDl0煛WЙZͥf.?rVݩAB(Tl@ǸvNDZ2ih} ǤTX2cpw љ)Z%NjS+k/* <^^ H{TH 9}sNsΖDBmxR9WVEdTp 4AY^dp.YIQ#%S0d 9IF +WU[߂m\HOt/' Y%'W^7_*cGL-IpL5Cdn4)98(uz0g[#ҫV |ֲ=Ҧ鉶J<kΝ\r^JMYoJ蘮4ݒv֓rZL*Y#S.pm=_}ziE,WQ|_Ew*eISlJxጓ A4Z4JZh6p4BɱL꫹/[9G,?#!_IфiFyخj*]xUeRT0j<<< #`ָF&ʡ _~rԩ!kO>q9 {՟lc?/`nw0c×`ջ`#?1=VJcށa} X7452+_ܗg, yUp.>V++-Lc" [ԼYYDA1`BY 88jlyٿX2s15Y\/l<=-F2Z4z_tr'MGz|-p_.te뮦}Vr`{l~ @22`x'#klRY-=)2|Kwc_Ѭsq4]Q@[8F0jÅ L7sE1uRޭe3~]:LV-qi>?Vc/ZB'r\h5u bdר$ 1z+$lJ+b.'^PR\lxUTt[W1Ik-rog'3vkT7nje$!QAߏpn枔 ǣy(UNИa_h Z-&9QJi8 j| 50pDb$ 8 b0),hKmiEA\cǷf j_wЕjp ciUwW4'8jXD7T}-[J5Ds*K8`x>:U79`ŵcDd]\sWkvj Tfˏl)uכUV)o'-:ေj@= oUw)5wK.AwtLHL ]qn;OZc()?2]l6±f0j) ]3J: KŘ#/85b f[:$ Q^LXTMb%c"tc&k;pyv&ģ%'Βc(5ءB[+ 
i(TP#Wdi5vYz9&֟.8FYFȯ<&٨-81 2uhmZ 6d<$Uc1Sy?ϓ.`yr+h^?o lg~{,D2PyɟHI~Yt%@HfUɣ=S; Zuـ >Ϋ&"7t i]kz6H\m_bF.'r^8td$VLyB"Jtr ?VE<E%N=JgN"0|Da8OKB 26M,l`*Lj(ULʄ'N+4"0 KC\[Lf< !O|"b[HƨUuxI~6W_Es|U%YH "TSI ܰTx55@sJj%9EY%YHIx*鵫1˹ZgJ \)Ax`[$)RV'`)}wD"躶]0e LRa fֲxm6r6mcx*&K*#V3Tܒ:(:2 @FYW{۫ȥ HBK/\K+h<!a1-L!x`jA# d&9i=}ϋ+cSEOD4{=^=!tY[McII+Zsb B0ەm uU Z#y!P׌ݶ]|7#1FJ L4Zi!KŦ6Ez!4!QltR*ɪjLVZuIAO3܍%9F4c7r%8fxo։`j ]:6RdP;v@0dG*1&:U?Z"Gp|Swޢ_Y?q_[p1u8'Zix.4`<8}-G<7'x2BwhÇEP.4 c<>ԼwB5ln%/t*#̏gŤa*@m'ts6 Kd[mJzm0P(ͶP{U<[D5^<=7wcxk K!y>1M] DVsgέS tl_$_o/Ff{ȳe NZГ DcIΣ|ZIU޳G58Nb'%HXfy?/_7nc}ltRԮqO[F/:7cd%>*'W=N pm: K&dxE 0ӯ$f6\VQFJEj^Pluu\s%(2:ɚ.?~3F9k`앰 [2{lǢ+$]n6XSM1.ʤ)ܥ{UI..oƄM'y"z6h2ىri>ln/z!Jǰuǿ`; \/شUL~ !,u?*z+ I3x-zm rчtw/@V-1Œ |ے2~ZdI nXN-<~~&qbfgb4Y4͵%+]&iiX4~}wVI' W*B. ldQ \lyfـWk)H-ɐ ^)EgHYitXGs{6q{H7p4^vI;!X9٣4qt9׵P4Cib6_)HY -7yW.h-tC|<~}vdt_)N&>Uϵr?ڿaPc{X+{ UU)[iYJ~6/|iP)^.ʂj E*6mqz`¯:0 YƬCNM:fALKLT-<J Ab]]zrD$4L|]ċnÛ,~h]O!sgߺ6U'eC΂H['?`q=pMe;N')e3H ߇͐CVx|fT-<tYp3?"^f} =͞<r@y>+1)<7JuR<)B(JaQ2EBn>QʀV}S8TPB  (/+u&pX)jS4}h!"N|sV2I!sB]sFFis":$.Ot% p HO &9DEtLrkm#Y_RUzHNj$eWg_n8[DgERIAttiƨn3 m^Pr?kApXpVX\ËS9ODQ:]^m;kGio[Z~иd9+}zZ#W5A;܎vj3z}4Z@Qcٛ.XsOG0)l{:zF k 9SʻL' ҡ_4NDnjaFӢD:06y̖10 y15ϖ -0*1+=ҍv X{ۢkqk?TKu)W5P! 
wr '~'G$ϮW3mM I}e>j&p'g1L ˁUOf `a/LE9~Eof)\T\&)HPDUI#iGEt3S|/3_' |~0VC'Wn9aMT՜Nk [ߜd>_}S,ʚ(^J`Zh*k.<ûyiֿ?N_[?fo( Oނ?RE8D,DDyd9"S*b&R`EXSŹHE_#((F!hS[cb)!_#7"y QJ=A@@4" 3D"X*crHE8T$RI D!t`9K.И hgv4RP]2ݻ1cSԂ⺣ =|qKKADW"AaDWqefkf1Y:؏z+nqNr" NB9LL΋ڡL7FR0ۻٛ"ct;7Wx3L_MLhtMcC3&-[$ol2(R0XyYF41Y,e0\G:SBj38d')lFbWvG3|k},mX ̱*H!pEdD'yo4`'i=XS%Zj\ӳk@8=Zj*]q{K-iAde3&S/,ᄦ4U`EY7Z:]UZ6#q^Y!m8]ڨ<Kj^6 udht5\t\= E,GBRJ I̢\9 E%|7X0%NmVaWG&7r&_Ev 3q1ѿ]ފfr]nhZ{/~3}ǎHh:0ٸ 5&Q3OӟdqGtux|Ro:` M@ ox1)TxG൏[-M50|8*#*.d`F[1e*q / 4[OLJK9DXi8 !eJ` _` _J{rX$b$eizCi%Md.TQr(N8FTXTj)+]?TiÅ\dN!F:JhNEIF*)db6Z 0qNR Lf*ILqf8yZKA!3^.6 &"McpSe&0PAY 1cD 1F(&,s3M8hlsjBsBc^ctQ-iwox۸0E(ʒ(jZ1=DgyBa)2TNS_2RCN;X;opIaS\}_V3?&|?O.I$jɼYi?"\`>[tdX_a}N`0\Z >չ ~9^?Y9EWt4V9:QʨK~}2|>3{q@= ^bDݷ\`ABIg$zuZ,ag $齢9|l_[@~>Z4'kK[n˽\&'9pm (~3sgܪ~gjܴ&2۸9>N=>Hˀ7lw*BU;r`,&CoB../]E E-\mM.gz.S7/Ѽ6jpC3$tUU9~GL(]-:~sJ+ڂQFf/m;촘R]:=k`վߕ ԤӚqj_"(8tNg@AQNphUW;5K HV5ᦇ5]8i? Iwk~sJѽk"i}28-W%:~-'hOMZM|Nj \phxyo b\$uvnG>ggRD -/L#<&{-*xw?k߉Iɿ]J,ј_e%^f}ϼh3|^DI6 6_7.o>d1ݼ~ʹzzٝS*E/mZs:opSӀjbI^h#Oj$O^>5[ss}ðctF03 `ZfՔ! Qaq֥EKXV:fB^# }$e3ϝ&ːbkqK^-_x? -&`iQJETF*Om~tlsv*.duل闛jvZqpppٽg87N¥7S"N: |fjvu:K ;PK4YVQĂt^Z k7 ;~WA0?d952G/x_s~A;n Tb~Z4z_9dy'Z4[%-Cpy7+ hzӵI`LӞxvu]f]ܝ7^E*2Yh{ X|-@%-gbn:CY >0wuYL~ֹk>ڋJMι1Jܔ6ZtUpg{q-c':k~|lYzr?Ti\Й7ѮoMP'A'N܋MxS; U\YPphdj~ؽj=SMt`<Br}s-HxϢ)P|y:Fe]v:ܺ|tVSy+FV=p]*f|}xyC̛h҆!04C6o*tǔ&7-af#o kHXR+*ppurT$wߠh~!QȽN'AA#$ )QuT8{ٲAg/3p^k8%ͽ5v24﫣OePƮs7SwFL(v(eS0ʸ_ F܎}o&D0Ŏt3>u%PI:OwkI q: @ʠM&4k$mCSUNާoufL+~\,Y|Zo$긮-\N,rBEH PbMٌ!6Wv',)T헁V/8Z0l?Znꎛv5X6|W*+nhjKn֣e3y ph*==!@..81؎J_5J|U Գ@7E t mu ͨW3Nڎ%=0ƨ-UL0}$r~$4e&:G/Հ t!CyEV=\pRh(%?U5U sF1^`G  =pSMxОcS;;]OO4]yuuzX\%GiZ:Wofͳ ѐo]T`;;V#*̰Zr}HZ ڂ dL.hB\1ny( !]̚aʖL8 (X$vħ 9%⫧H@oJ[9@⼔C!gPȥ6`Y rj6RO7pSf1rlal=w\٧1? 
I>SQJYEmzۙ>i7B?5o\oVOۋQ9+f4x/B3GQ/Tߙx~~|N><3{ۣ?A%#XP@9}V s!k:y(H"CHoU#%hc HC?zf/VsQ {__~HVg`Aev3$k,\vE|1g9\GU.D&q8-=zG#8c^mؗ 1@1/8#J5!O b޼7c~{Q|:ocJ`χz!omr{5 _USQvU_ H)[($hq@:^P풗"# /NVB&Yy\֋jeaU-PzFWҀ|26nB{,\y8I"7A8opἩ f6~('L; Rh(T\ BJJ͑Z(G`ԕ0f藀NJ ooFg3/`xj7n{cz}REꝣŝxq7䗟- ch:Qy,諘b*)EYj"ӊ:%J %7Z(cVԦdP~dL22k}9k 0H8~ں4ܪ>d Sv)K5+2{dmS" ]BN o/?Oy$LUl-|ݷ;,>U!0PJؙ4`VW ߒ@IM5Ra8qJaV.Kc}-ZOlIط@cJRe+K/[494.,R0nDV@긣XpՒ@SGnD!Fbܮ$X~ɡ$=y%`&c7](mbǺF DBsm}\*Lvݪ(g"$5)9( vJ\ $JI)KQAU{ ЮzlG3Ib<5׎E|%),!ЫOYc5.KKo *u)t"|@֯nIzWN1p-XITP붇W\ժ+p+4oeF:l4|Dlxi&O6dL"; (+驨vdI %CQ;%CɤL!dmJr 5T|ڑ!mPds2Νj-xZ7Ӻ MnV7W0]boLZJ9%a.zGrf-09NvkǨnM}QR[oi jOһ cn;+/'uTMn#4̝9Ih$z #/W}3{C5Z*021 ]2i^5'1BZr 줳23 ?gjXv 9QNZI$80yCjmbhBC*lEbGk֕euU(g9sah{m" =N@#B˘-(YETDFV2LH-E%PcxA g8q D "$I⩭QVƈ'jeK8N؆=s >dLCbBhK?]q}LSM%NCSI=|d 9̧N`;HitҖ"a,vpUzWGJUHjB)jQf+fV̌Z[]VuG 5;A]|{RPWަ k:ߌRm-c-v[5s򎹅p8Izٽ@aqX4JMp -tftF夂f~Hq rMv ـ@*#Ʉ=|qMيOk2!S$(<h݈֢nXo &a׻pXFi;lSKBolJ{w4,'cW[( `}ۓ7@xJ  x>۶SpBmp{2r/Ϯb\bȗZWje)e5O S$<5)_Z♺uUB8FX"tcA#9?iHnO:'c]?اk@֘=YlTԞh=Xj=;FFCT5Ð{}s/Eqm \>z*gfw? 
;Q{{"Z,:pfky捑(V4J,mVX2p52VGE^ًU'y-I µܿb c4@ aե]F<|mô-J nn{[{5B7zu;"NŒrG#ru:dI)첄ޛU SnZ?:igJzHThJxJS`0zJml~%r5c͢!Ggn2FFL#9!p! ()/xX 1vĦxqv):)z5ԷKU|TUUyUR?_Pduv_$JT~{#:~*p <} \*=SF.EcD+F7ILRp2.eFF 2MͰ'HDQg@) BfhvԷ|K̷`~o)BSpc!N1 I Vy-z  $vE"5Tk1pkp‚X: Dx$sT*͑raV+Z`$BHT=.<`F\{f4sJX`735B1o^Uø/*ZH-Ie9\k-SˤˤT?1CB1IZ\07zyybд2+F"jy1BsB1ʤKGއ*"%DY#x Z ic>SJ*&vfgP1?\om:s/$,KvqP8D# ibсwβ% 'ٔdۓ*F;mZ-9/ݞCxa;9-=1PƤh^uX:Wϼ,k"E~ډֻQrS0 $+8wu7N'L229A[$+HbE&7)TNWڑ@$ÚsN#-2+BZ_RV2ya) $Wė}ܣ B -1אdII>zdjn?Bybc..LNb^IbLVt`~|їLMyxxkto+? fO#var/home/core/zuul-output/logs/kubelet.log0000644000000000000000004612725515137416605017721 0ustar rootrootJan 31 14:41:34 crc systemd[1]: Starting Kubernetes Kubelet... Jan 31 14:41:34 crc restorecon[4694]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 
crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 
14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 14:41:34 crc 
restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:34 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 
14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc 
restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc 
restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 14:41:35 crc restorecon[4694]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 
crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc 
restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc 
restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc 
restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc 
restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 14:41:35 crc restorecon[4694]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 31 14:41:36 crc kubenswrapper[4751]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 14:41:36 crc kubenswrapper[4751]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 31 14:41:36 crc kubenswrapper[4751]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 14:41:36 crc kubenswrapper[4751]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 31 14:41:36 crc kubenswrapper[4751]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 31 14:41:36 crc kubenswrapper[4751]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.133917 4751 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.139970 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140002 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140012 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140021 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140029 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140039 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140051 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140061 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140097 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140106 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140114 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140123 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140131 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140139 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140148 4751 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140156 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140164 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140171 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140179 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140186 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140194 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 
14:41:36.140201 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140209 4751 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140216 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140224 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140232 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140239 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140247 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140254 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140262 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140270 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140278 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140285 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140293 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140300 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140308 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140316 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140324 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140332 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140339 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140347 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140359 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140371 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140380 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140389 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140398 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140407 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140415 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140423 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140431 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140440 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140448 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140458 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140465 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140473 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140481 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140488 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140496 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140506 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140515 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140524 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140535 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140546 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140555 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140563 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140571 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140579 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140588 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140596 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140604 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.140612 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140742 4751 flags.go:64] FLAG: --address="0.0.0.0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140758 4751 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140772 4751 flags.go:64] FLAG: --anonymous-auth="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140786 4751 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140801 4751 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140812 4751 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140828 4751 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140842 4751 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140853 4751 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140865 4751 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140876 4751 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140888 4751 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140901 4751 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140912 4751 flags.go:64] FLAG: --cgroup-root=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140923 4751 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140934 4751 flags.go:64] FLAG: --client-ca-file=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140944 4751 flags.go:64] FLAG: --cloud-config=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140953 4751 flags.go:64] FLAG: --cloud-provider=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140962 4751 flags.go:64] FLAG: --cluster-dns="[]"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140976 4751 flags.go:64] FLAG: --cluster-domain=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140985 4751 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.140994 4751 flags.go:64] FLAG: --config-dir=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141003 4751 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141013 4751 flags.go:64] FLAG: --container-log-max-files="5"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141031 4751 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141041 4751 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141050 4751 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141060 4751 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141100 4751 flags.go:64] FLAG: --contention-profiling="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141110 4751 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141119 4751 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141131 4751 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141140 4751 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141152 4751 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141161 4751 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141170 4751 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141179 4751 flags.go:64] FLAG: --enable-load-reader="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141188 4751 flags.go:64] FLAG: --enable-server="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141198 4751 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141209 4751 flags.go:64] FLAG: --event-burst="100"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141219 4751 flags.go:64] FLAG: --event-qps="50"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141228 4751 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141237 4751 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141247 4751 flags.go:64] FLAG: --eviction-hard=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141258 4751 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141267 4751 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141276 4751 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141286 4751 flags.go:64] FLAG: --eviction-soft=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141295 4751 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141304 4751 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141313 4751 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141322 4751 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141331 4751 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141339 4751 flags.go:64] FLAG: --fail-swap-on="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141349 4751 flags.go:64] FLAG: --feature-gates=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141360 4751 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141369 4751 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141378 4751 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141387 4751 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141397 4751 flags.go:64] FLAG: --healthz-port="10248"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141406 4751 flags.go:64] FLAG: --help="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141416 4751 flags.go:64] FLAG: --hostname-override=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141425 4751 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141435 4751 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141445 4751 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141453 4751 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141462 4751 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141471 4751 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141481 4751 flags.go:64] FLAG: --image-service-endpoint=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141490 4751 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141499 4751 flags.go:64] FLAG: --kube-api-burst="100"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141508 4751 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141517 4751 flags.go:64] FLAG: --kube-api-qps="50"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141526 4751 flags.go:64] FLAG: --kube-reserved=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141535 4751 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141544 4751 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141553 4751 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141562 4751 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141593 4751 flags.go:64] FLAG: --lock-file=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141602 4751 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141612 4751 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141622 4751 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141635 4751 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141644 4751 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141653 4751 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141663 4751 flags.go:64] FLAG: --logging-format="text"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141672 4751 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141681 4751 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141690 4751 flags.go:64] FLAG: --manifest-url=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141699 4751 flags.go:64] FLAG: --manifest-url-header=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141712 4751 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141721 4751 flags.go:64] FLAG: --max-open-files="1000000"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141732 4751 flags.go:64] FLAG: --max-pods="110"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141741 4751 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141750 4751 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141761 4751 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141770 4751 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141779 4751 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141788 4751 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141797 4751 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141816 4751 flags.go:64] FLAG: --node-status-max-images="50"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141826 4751 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141835 4751 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141844 4751 flags.go:64] FLAG: --pod-cidr=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141854 4751 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141868 4751 flags.go:64] FLAG: --pod-manifest-path=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141877 4751 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141886 4751 flags.go:64] FLAG: --pods-per-core="0"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141895 4751 flags.go:64] FLAG: --port="10250"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141904 4751 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141913 4751 flags.go:64] FLAG: --provider-id=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141922 4751 flags.go:64] FLAG: --qos-reserved=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141931 4751 flags.go:64] FLAG: --read-only-port="10255"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141940 4751 flags.go:64] FLAG: --register-node="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141949 4751 flags.go:64] FLAG: --register-schedulable="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141957 4751 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141972 4751 flags.go:64] FLAG: --registry-burst="10"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141981 4751 flags.go:64] FLAG: --registry-qps="5"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141990 4751 flags.go:64] FLAG: --reserved-cpus=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.141998 4751 flags.go:64] FLAG: --reserved-memory=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142009 4751 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142019 4751 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142028 4751 flags.go:64] FLAG: --rotate-certificates="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142037 4751 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142046 4751 flags.go:64] FLAG: --runonce="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142054 4751 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142089 4751 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142098 4751 flags.go:64] FLAG: --seccomp-default="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142108 4751 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142116 4751 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142126 4751 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142135 4751 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142145 4751 flags.go:64] FLAG: --storage-driver-password="root"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142153 4751 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142162 4751 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142171 4751 flags.go:64] FLAG: --storage-driver-user="root"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142179 4751 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142190 4751 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142199 4751 flags.go:64] FLAG: --system-cgroups=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142208 4751 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142223 4751 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142232 4751 flags.go:64] FLAG: --tls-cert-file=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142242 4751 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142253 4751 flags.go:64] FLAG: --tls-min-version=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142262 4751 flags.go:64] FLAG: --tls-private-key-file=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142271 4751 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142280 4751 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142289 4751 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142298 4751 flags.go:64] FLAG: --v="2"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142309 4751 flags.go:64] FLAG: --version="false"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142320 4751 flags.go:64] FLAG: --vmodule=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142331 4751 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.142341 4751 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142541 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142551 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142561 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142569 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142579 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142589 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142599 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142609 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142618 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142627 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142637 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142646 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142656 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142664 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142673 4751 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142683 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142691 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142700 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142710 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142717 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142725 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142733 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142740 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142749 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142757 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142765 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142772 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142780 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142788 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142795 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142803 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142812 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142819 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142827 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142835 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142843 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142851 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142859 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142866 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142874 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142882 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142890 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142898 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142908 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142918 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142927 4751 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142935 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142943 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142951 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142959 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142967 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142974 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142985 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.142994 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143003 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143011 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143019 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143028 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143036 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143045 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143053 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143060 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143093 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143101 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143109 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143116 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143124 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143132 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143140 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143147 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.143156 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.143181 4751 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.156466 4751 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.156517 4751 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156654 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156668 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156678 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156688 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156697 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 14:41:36
crc kubenswrapper[4751]: W0131 14:41:36.156707 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156716 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156725 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156735 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156747 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156757 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156767 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156775 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156783 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156793 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156801 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156810 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156819 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156827 4751 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156835 4751 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156843 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156851 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156859 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156867 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156876 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156885 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156893 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156902 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156909 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156917 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156924 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156932 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156940 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 14:41:36 crc 
kubenswrapper[4751]: W0131 14:41:36.156947 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156955 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156966 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156976 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.156985 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157016 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157024 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157032 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157040 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157048 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157057 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157064 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157095 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157103 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 
14:41:36.157114 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157124 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157134 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157166 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157174 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157183 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157194 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157202 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157210 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157218 4751 feature_gate.go:330] unrecognized feature gate: Example Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157226 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157234 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157244 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157254 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157262 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157270 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157278 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157285 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157293 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157300 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157308 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157316 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157323 4751 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157331 4751 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.157344 4751 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157643 4751 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157657 4751 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157665 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157673 4751 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157681 4751 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157692 4751 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157702 4751 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157711 4751 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157719 4751 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157727 4751 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157735 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157743 4751 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157750 4751 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157758 4751 feature_gate.go:330] 
unrecognized feature gate: NewOLM Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157779 4751 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157787 4751 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157795 4751 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157802 4751 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157810 4751 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157818 4751 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157829 4751 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157839 4751 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157849 4751 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157857 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157865 4751 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157873 4751 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157881 4751 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157890 4751 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157899 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157907 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157914 4751 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157922 4751 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157930 4751 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157937 4751 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157945 4751 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157952 4751 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157960 4751 feature_gate.go:330] unrecognized feature gate: Example Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157968 4751 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157976 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157984 4751 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157992 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.157999 4751 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158008 4751 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158015 4751 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158023 4751 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158031 4751 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158038 4751 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158047 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158054 4751 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158062 4751 feature_gate.go:330] unrecognized feature gate: 
GCPClusterHostedDNS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158108 4751 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158117 4751 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158125 4751 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158133 4751 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158141 4751 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158148 4751 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158156 4751 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158164 4751 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158171 4751 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158179 4751 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158187 4751 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158195 4751 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158202 4751 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158210 4751 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158217 
4751 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158227 4751 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158236 4751 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158244 4751 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158252 4751 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158262 4751 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.158272 4751 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.158284 4751 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.159262 4751 server.go:940] "Client rotation is on, will bootstrap in background" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.172401 4751 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.173131 4751 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.175223 4751 server.go:997] "Starting client certificate rotation" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.175290 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.176361 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 03:29:51.269274361 +0000 UTC Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.176464 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.207471 4751 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.213165 4751 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.213338 4751 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.228026 4751 log.go:25] "Validated CRI v1 runtime API" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.271963 4751 log.go:25] "Validated CRI v1 image API" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.274627 4751 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.281534 4751 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-31-14-37-27-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.281584 4751 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.314672 4751 manager.go:217] Machine: {Timestamp:2026-01-31 14:41:36.311834054 +0000 UTC m=+0.686546999 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef BootID:2bc08d22-1e39-4800-b402-ea260cc19637 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e8:75:38 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e8:75:38 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:37:66:ad Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b8:87:0c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2b:85:13 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6a:e2:76 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:31:b3:08:bc:b9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:46:ee:c5:32:e1:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.315055 4751 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available.
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.315391 4751 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.315899 4751 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.316213 4751 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.316273 4751 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.316625 4751 topology_manager.go:138] "Creating topology manager with none policy"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.316645 4751 container_manager_linux.go:303] "Creating device plugin manager"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.317144 4751 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.317195 4751 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.317477 4751 state_mem.go:36] "Initialized new in-memory state store"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.318216 4751 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.323386 4751 kubelet.go:418] "Attempting to sync node with API server"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.323431 4751 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.323473 4751 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.323494 4751 kubelet.go:324] "Adding apiserver pod source"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.323517 4751 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.326736 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused
Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.326855 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError"
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.326881 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused
Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.326958 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.328966 4751 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.330495 4751 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.332210 4751 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333828 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333873 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333887 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333902 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333923 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333937 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333950 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.333999 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.334014 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.334029 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.334108 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.334143 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.335186 4751 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.336155 4751 server.go:1280] "Started kubelet"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.336287 4751 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.336978 4751 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.337145 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.337903 4751 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 31 14:41:36 crc systemd[1]: Started Kubernetes Kubelet.
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.339164 4751 server.go:460] "Adding debug handlers to kubelet server"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.341242 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.341361 4751 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.342192 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:23:34.700402174 +0000 UTC
Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.342839 4751 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.343179 4751 volume_manager.go:287] "The desired_state_of_world populator starts"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.343245 4751 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.343285 4751 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.344125 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused
Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.344292 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError"
Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.344765 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="200ms"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.352340 4751 factory.go:55] Registering systemd factory
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.352393 4751 factory.go:221] Registration of the systemd container factory successfully
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.353052 4751 factory.go:153] Registering CRI-O factory
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.353129 4751 factory.go:221] Registration of the crio container factory successfully
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.353264 4751 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api
service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.353299 4751 factory.go:103] Registering Raw factory
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.353327 4751 manager.go:1196] Started watching for new ooms in manager
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.354324 4751 manager.go:319] Starting recovery of all containers
Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.352534 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.98:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fd7d6d88ee3c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 14:41:36.336110532 +0000 UTC m=+0.710823457,LastTimestamp:2026-01-31 14:41:36.336110532 +0000 UTC m=+0.710823457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367349 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367441 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367470 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367493 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367512 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367532 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367551 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367570 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367594 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367616 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367635 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367655 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367676 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367698 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367717 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660"
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367782 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367803 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367822 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367842 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367861 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367880 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367900 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367922 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367941 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367960 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.367981 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368003 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368023 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368043 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368063 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368114 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368170 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368197 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368216 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368234 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368253 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368272 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368291 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368310 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368328 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368348 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368367 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368387 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368407 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368425 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368443 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368465 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368499 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368529 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368555 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368583 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368608 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368640 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368669 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368696 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368725 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368756 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368783 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368808 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368827 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368845 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368865 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368885 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368903 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368934 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368953 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.368977 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369002 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369044 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert"
seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369104 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369130 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369154 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369181 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369206 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369233 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369259 4751 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369281 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369307 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.369333 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372362 4751 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372445 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 31 14:41:36 crc 
kubenswrapper[4751]: I0131 14:41:36.372473 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372497 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372520 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372543 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372566 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372594 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372623 4751 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372654 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372684 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372741 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372774 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372802 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372834 4751 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372862 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372891 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372920 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372943 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372962 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.372983 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373002 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373023 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373043 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373061 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373145 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373194 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373224 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373255 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373285 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373313 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373337 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373356 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" 
seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373383 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373403 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373424 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373445 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373811 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373830 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 
14:41:36.373853 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373872 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373891 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373930 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373969 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.373994 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374017 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374042 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374127 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374176 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374204 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374244 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374280 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374308 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374332 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374361 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374406 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374435 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374460 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374483 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374506 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374526 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374544 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374567 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374593 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374617 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374646 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374665 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374683 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374700 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374726 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374749 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374765 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374781 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374804 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374828 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374851 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374866 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374885 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374902 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374918 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374948 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374967 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.374987 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375005 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375023 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375041 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375059 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375137 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375325 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.375356 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378030 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378106 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378127 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378146 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378165 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378188 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378206 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378238 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378255 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378284 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378304 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378323 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378339 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378357 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378374 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378393 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378411 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378428 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378444 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378458 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378475 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378490 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378507 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378526 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378552 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378572 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378590 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378608 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 
14:41:36.378627 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378683 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378707 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378728 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378747 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378766 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378784 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378800 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378819 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378836 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378854 4751 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378872 4751 reconstruct.go:97] "Volume reconstruction finished" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.378886 4751 reconciler.go:26] "Reconciler: start to sync state" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.381848 4751 manager.go:324] Recovery completed Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.393185 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 
14:41:36.395594 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.395656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.395670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.396430 4751 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.396453 4751 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.396492 4751 state_mem.go:36] "Initialized new in-memory state store" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.402695 4751 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.404545 4751 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.404585 4751 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.404621 4751 kubelet.go:2335] "Starting kubelet main sync loop" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.404693 4751 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.407628 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.407717 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.415514 4751 policy_none.go:49] "None policy: Start" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.417313 4751 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.417348 4751 state_mem.go:35] "Initializing new in-memory state store" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.443611 4751 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.468254 4751 manager.go:334] "Starting Device Plugin manager" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.468475 4751 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.468569 4751 server.go:79] "Starting device plugin registration server" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.469231 4751 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.469364 4751 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.469902 4751 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.470352 4751 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.470462 4751 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.478392 4751 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.505269 4751 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.505384 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.509424 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.509479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.509497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.510031 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.510113 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.510381 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.511615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.511676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.511694 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.511788 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.511828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.511846 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.512044 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.512280 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.512344 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513326 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513547 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513602 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513624 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.513665 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.514673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.514699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.514767 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.514724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.514824 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.514861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.515047 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: 
I0131 14:41:36.515139 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.515176 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516328 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516441 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516506 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516902 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.516988 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.518282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.518316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.518329 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.545950 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="400ms" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.570592 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.571950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.572002 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.572016 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.572043 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.572686 4751 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.98:6443: connect: connection refused" node="crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.580864 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.580918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581186 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581250 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581401 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581481 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581531 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581576 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581620 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581682 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581729 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581775 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581820 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581861 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.581901 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683217 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683256 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683317 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683353 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683360 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683415 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683480 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683433 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683531 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683551 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683590 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683641 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683642 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683653 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683687 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683749 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683800 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683818 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683843 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc 
kubenswrapper[4751]: I0131 14:41:36.683883 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683918 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683920 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683966 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683970 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.684001 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.684062 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.683885 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.773692 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.775724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.775790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.775813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.775854 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.776626 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.98:6443: connect: connection refused" node="crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.854513 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.867345 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.896447 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.919024 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fe2b08f694193d48aba43b738bc3ad0e34f6dade5d9923f898bce86b51df2cd4 WatchSource:0}: Error finding container fe2b08f694193d48aba43b738bc3ad0e34f6dade5d9923f898bce86b51df2cd4: Status 404 returned error can't find the container with id fe2b08f694193d48aba43b738bc3ad0e34f6dade5d9923f898bce86b51df2cd4 Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.924143 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-88b00d0aa2e6a00ded22705bbdd4e1c09877c9a7026ebc2131992b951ea468f8 WatchSource:0}: Error finding container 88b00d0aa2e6a00ded22705bbdd4e1c09877c9a7026ebc2131992b951ea468f8: Status 404 returned error can't find the container with id 88b00d0aa2e6a00ded22705bbdd4e1c09877c9a7026ebc2131992b951ea468f8 Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.924429 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: I0131 14:41:36.932791 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.933654 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3d8f5f7217a27dd1cc29e7316d4f3457b960feaa8a38ea18ee8ce25778480b97 WatchSource:0}: Error finding container 3d8f5f7217a27dd1cc29e7316d4f3457b960feaa8a38ea18ee8ce25778480b97: Status 404 returned error can't find the container with id 3d8f5f7217a27dd1cc29e7316d4f3457b960feaa8a38ea18ee8ce25778480b97 Jan 31 14:41:36 crc kubenswrapper[4751]: E0131 14:41:36.947671 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="800ms" Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.949978 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3a08a017395c4b14c93b205b62a2857317a99bc053cf38153b08b0ddfab6d072 WatchSource:0}: Error finding container 3a08a017395c4b14c93b205b62a2857317a99bc053cf38153b08b0ddfab6d072: Status 404 returned error can't find the container with id 3a08a017395c4b14c93b205b62a2857317a99bc053cf38153b08b0ddfab6d072 Jan 31 14:41:36 crc kubenswrapper[4751]: W0131 14:41:36.952760 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-76b06490ea6d342f0413a5f40a449f0749d6666ab75911ec1426cbcfa4258c9e 
WatchSource:0}: Error finding container 76b06490ea6d342f0413a5f40a449f0749d6666ab75911ec1426cbcfa4258c9e: Status 404 returned error can't find the container with id 76b06490ea6d342f0413a5f40a449f0749d6666ab75911ec1426cbcfa4258c9e Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.177164 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.179724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.179804 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.179823 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.179868 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.180932 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.98:6443: connect: connection refused" node="crc" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.338102 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.343191 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 16:30:09.415993748 +0000 UTC Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.410572 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a08a017395c4b14c93b205b62a2857317a99bc053cf38153b08b0ddfab6d072"} Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.411486 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d8f5f7217a27dd1cc29e7316d4f3457b960feaa8a38ea18ee8ce25778480b97"} Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.414436 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"88b00d0aa2e6a00ded22705bbdd4e1c09877c9a7026ebc2131992b951ea468f8"} Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.416182 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fe2b08f694193d48aba43b738bc3ad0e34f6dade5d9923f898bce86b51df2cd4"} Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.420701 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"76b06490ea6d342f0413a5f40a449f0749d6666ab75911ec1426cbcfa4258c9e"} Jan 31 14:41:37 crc kubenswrapper[4751]: W0131 14:41:37.424457 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.424518 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:37 crc kubenswrapper[4751]: W0131 14:41:37.609460 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.609567 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.748884 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="1.6s" Jan 31 14:41:37 crc kubenswrapper[4751]: W0131 14:41:37.787557 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.787668 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:37 
crc kubenswrapper[4751]: W0131 14:41:37.803434 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.803516 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.981696 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.983875 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.983936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.983954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:37 crc kubenswrapper[4751]: I0131 14:41:37.983989 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 14:41:37 crc kubenswrapper[4751]: E0131 14:41:37.984620 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.98:6443: connect: connection refused" node="crc" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.288770 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 14:41:38 
crc kubenswrapper[4751]: E0131 14:41:38.290342 4751 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.338016 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.344219 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 17:22:23.359146688 +0000 UTC Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.427448 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c"} Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.427517 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb"} Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.427537 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853"} Jan 31 
14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.429844 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254" exitCode=0 Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.429926 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254"} Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.430106 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.431581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.431637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.431656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.432664 4751 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ccd9efb7096722c8a48318444b235a1970fbec711faf7448d47696ff84da5d37" exitCode=0 Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.432768 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.432778 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ccd9efb7096722c8a48318444b235a1970fbec711faf7448d47696ff84da5d37"} Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 
14:41:38.433990 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.434351 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.434393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.434409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.435164 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.435242 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.435269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.435607 4751 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9" exitCode=0 Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.435673 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9"} Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.435716 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.437286 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.437373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.437391 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.440173 4751 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034" exitCode=0 Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.440240 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.440246 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034"} Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.441404 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.441456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:38 crc kubenswrapper[4751]: I0131 14:41:38.441473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:39 crc kubenswrapper[4751]: W0131 14:41:39.039680 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused Jan 31 
14:41:39 crc kubenswrapper[4751]: E0131 14:41:39.040004 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.338953 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.345369 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:35:23.180008567 +0000 UTC
Jan 31 14:41:39 crc kubenswrapper[4751]: E0131 14:41:39.350193 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="3.2s"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.448415 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.448570 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.449895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.449947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.449963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.452515 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.452562 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.452582 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.452598 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.454957 4751 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d1020dca4733e38925646f97eb80524c4060630e33323e9a5a0fdc4634c6b468" exitCode=0
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.455155 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.455280 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d1020dca4733e38925646f97eb80524c4060630e33323e9a5a0fdc4634c6b468"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.456138 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.456170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.456186 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.460987 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.460986 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.461160 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.461196 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.462764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.462804 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.462816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.470774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3e9f62b49c0d916da6e1631f3216d52fd37ab407e878dc0509ccb19d0e5fb1df"}
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.470836 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.471948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.471976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.472009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.585556 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.592910 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.592951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.592964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.592993 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 31 14:41:39 crc kubenswrapper[4751]: E0131 14:41:39.593547 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.98:6443: connect: connection refused" node="crc"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.776724 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:41:39 crc kubenswrapper[4751]: W0131 14:41:39.783871 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.98:6443: connect: connection refused
Jan 31 14:41:39 crc kubenswrapper[4751]: E0131 14:41:39.783972 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.98:6443: connect: connection refused" logger="UnhandledError"
Jan 31 14:41:39 crc kubenswrapper[4751]: I0131 14:41:39.785892 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.345739 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:06:37.66158861 +0000 UTC
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.479759 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2"}
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.479952 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.481320 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.481367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.481389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483394 4751 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bf0f78147bc50d98a5ba239c2456467778fb4724433d914b9ee4300ce3af6e4a" exitCode=0
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483548 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483582 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483631 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483627 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bf0f78147bc50d98a5ba239c2456467778fb4724433d914b9ee4300ce3af6e4a"}
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483641 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483720 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.483750 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488277 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488334 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488323 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488488 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.488745 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.490221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.490264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:40 crc kubenswrapper[4751]: I0131 14:41:40.490359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.345908 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 14:18:47.640302556 +0000 UTC
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.490829 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.490857 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.490883 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.490816 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e66ea760a35f4e073d5ead7b0270164010b4dd14737b23202f83a10290f75d3c"}
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.491003 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7f4a4eb52c2c850f91c212fdc556452ab8cc91168ddb67c2078b806d8725be2a"}
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.491064 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c0b0fe57d51f2684ba60b1818c1e3010e5364c6d196433972b46cb3c3f9b5e61"}
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.490948 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.492386 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.492438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.492454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.493658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.493690 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.493705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.493713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.493747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:41 crc kubenswrapper[4751]: I0131 14:41:41.493763 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.157197 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.300249 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.347106 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 15:03:22.896582443 +0000 UTC
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.387509 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.500113 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"92d196e489f72bd3c04ada6d0ea993f0ad89eb42497efc8723720ca3a7720509"}
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.500194 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.500261 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.500314 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.500340 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.500204 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa739a6a66bd2196c9131cf929bdb8a133e3e40c3dfa9a105bb3ea33fa2ede20"}
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501824 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501851 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501919 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.501971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.502009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.502032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.793910 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.795755 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.795815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.795850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:42 crc kubenswrapper[4751]: I0131 14:41:42.795907 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.347512 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 16:14:36.729956994 +0000 UTC
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.503543 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.504779 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.504839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.504861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.603788 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.604166 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.605627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.605677 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.605695 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:43 crc kubenswrapper[4751]: I0131 14:41:43.875636 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:41:44 crc kubenswrapper[4751]: I0131 14:41:44.348465 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:55:25.590535162 +0000 UTC
Jan 31 14:41:44 crc kubenswrapper[4751]: I0131 14:41:44.506916 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:44 crc kubenswrapper[4751]: I0131 14:41:44.508205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:44 crc kubenswrapper[4751]: I0131 14:41:44.508266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:44 crc kubenswrapper[4751]: I0131 14:41:44.508290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:45 crc kubenswrapper[4751]: I0131 14:41:45.349338 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 05:34:51.743664375 +0000 UTC
Jan 31 14:41:45 crc kubenswrapper[4751]: I0131 14:41:45.693661 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 31 14:41:45 crc kubenswrapper[4751]: I0131 14:41:45.693959 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:45 crc kubenswrapper[4751]: I0131 14:41:45.695606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:45 crc kubenswrapper[4751]: I0131 14:41:45.695646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:45 crc kubenswrapper[4751]: I0131 14:41:45.695666 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:46 crc kubenswrapper[4751]: I0131 14:41:46.350272 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 02:57:28.635188184 +0000 UTC
Jan 31 14:41:46 crc kubenswrapper[4751]: E0131 14:41:46.478527 4751 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.231228 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.231505 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.233006 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.233056 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.233113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.239807 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.350444 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 03:56:57.665967953 +0000 UTC
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.514335 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.515581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.515618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:47 crc kubenswrapper[4751]: I0131 14:41:47.515629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:48 crc kubenswrapper[4751]: I0131 14:41:48.350913 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:02:58.834129059 +0000 UTC
Jan 31 14:41:49 crc kubenswrapper[4751]: I0131 14:41:49.351492 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:59:40.555076452 +0000 UTC
Jan 31 14:41:50 crc kubenswrapper[4751]: W0131 14:41:50.007132 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.007242 4751 trace.go:236] Trace[1671566675]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 14:41:40.005) (total time: 10001ms):
Jan 31 14:41:50 crc kubenswrapper[4751]: Trace[1671566675]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:41:50.007)
Jan 31 14:41:50 crc kubenswrapper[4751]: Trace[1671566675]: [10.001706612s] [10.001706612s] END
Jan 31 14:41:50 crc kubenswrapper[4751]: E0131 14:41:50.007273 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.231439 4751 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.231533 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.339556 4751 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.352164 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 17:01:15.962187861 +0000 UTC
Jan 31 14:41:50 crc kubenswrapper[4751]: W0131 14:41:50.724101 4751 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.724225 4751 trace.go:236] Trace[1279759450]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 14:41:40.722) (total time: 10002ms):
Jan 31 14:41:50 crc kubenswrapper[4751]: Trace[1279759450]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:41:50.724)
Jan 31 14:41:50 crc kubenswrapper[4751]: Trace[1279759450]: [10.002026007s] [10.002026007s] END
Jan 31 14:41:50 crc kubenswrapper[4751]: E0131 14:41:50.724255 4751 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.816012 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.816123 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.823717 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 31 14:41:50 crc kubenswrapper[4751]: I0131 14:41:50.823784 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.186402 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.186738 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.188265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.188308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.188322 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.237447 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.352748 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:54:14.548595716 +0000 UTC
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.526837 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.528194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.528264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.528282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:51 crc kubenswrapper[4751]: I0131 14:41:51.550140 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.353487 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 17:43:07.426118064 +0000 UTC
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.395732 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.395975 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.397482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.397547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.397565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.406368 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.530025 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.530154 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.531537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.531586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.531604 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.531539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.531726 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:41:52 crc kubenswrapper[4751]: I0131 14:41:52.531756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:41:53 crc kubenswrapper[4751]: I0131 14:41:53.354401 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 11:00:24.715521795 +0000 UTC
Jan 31 14:41:53 crc kubenswrapper[4751]: I0131 14:41:53.926569 4751 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 31 14:41:54 crc kubenswrapper[4751]: I0131 14:41:54.355821 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:55:10.154730813 +0000 UTC
Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.356751 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:06:31.751453667 +0000 UTC
Jan 31 14:41:55 crc kubenswrapper[4751]: E0131 14:41:55.805378 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.808987 4751 trace.go:236] Trace[113917315]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 14:41:43.134) (total time: 12674ms):
Jan 31 14:41:55 crc kubenswrapper[4751]: Trace[113917315]: ---"Objects listed" error: 12674ms (14:41:55.808)
Jan 31 14:41:55 crc kubenswrapper[4751]: Trace[113917315]: [12.674308794s] [12.674308794s] END
Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.809063 4751 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.810345 4751 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.810625 4751 trace.go:236] Trace[1609368597]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 14:41:43.600) (total time: 12209ms):
Jan 31 14:41:55 crc kubenswrapper[4751]:
Trace[1609368597]: ---"Objects listed" error: 12209ms (14:41:55.810) Jan 31 14:41:55 crc kubenswrapper[4751]: Trace[1609368597]: [12.209705652s] [12.209705652s] END Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.810668 4751 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 14:41:55 crc kubenswrapper[4751]: E0131 14:41:55.811886 4751 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.822622 4751 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.872179 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59828->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.872327 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59828->192.168.126.11:17697: read: connection reset by peer" Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.872250 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59838->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 14:41:55 crc 
kubenswrapper[4751]: I0131 14:41:55.872499 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59838->192.168.126.11:17697: read: connection reset by peer" Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.873140 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.873237 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.873776 4751 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 14:41:55 crc kubenswrapper[4751]: I0131 14:41:55.873835 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.357211 4751 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 10:28:40.704318502 +0000 UTC Jan 31 14:41:56 crc kubenswrapper[4751]: E0131 14:41:56.478649 4751 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.544570 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.546991 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2" exitCode=255 Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.547051 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2"} Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.547294 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.548725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.548794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.548821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.549902 4751 scope.go:117] "RemoveContainer" 
containerID="16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2" Jan 31 14:41:56 crc kubenswrapper[4751]: I0131 14:41:56.809309 4751 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.234966 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.239331 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.337056 4751 apiserver.go:52] "Watching apiserver" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.340450 4751 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.340975 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.341555 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.341923 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.342066 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.342172 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.342224 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.342281 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.342276 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.342353 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.342389 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.343909 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.344139 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.344489 4751 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.344952 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.345164 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.345187 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.346370 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.347025 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.347566 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.348491 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.357611 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:22:54.656469349 +0000 UTC Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.366477 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d
85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.382448 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.404987 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.415676 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422299 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422343 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422367 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422389 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422408 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422428 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422448 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422468 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422490 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422539 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422560 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422580 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422606 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422627 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422648 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422669 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422690 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422748 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422765 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422786 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422808 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422829 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422850 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422869 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422889 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422912 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422921 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422943 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.422939 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423028 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423066 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423132 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423166 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423200 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423238 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423272 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423307 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423341 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423376 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423419 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423510 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423563 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423615 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423667 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423718 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423758 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423791 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423829 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423863 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423896 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423929 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423960 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423994 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424030 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427358 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427403 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427444 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423100 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427509 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423195 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423237 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423396 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423591 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427562 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423645 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423658 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423704 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.423855 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424020 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424409 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424440 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424535 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424710 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424729 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424752 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424789 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.424952 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425032 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425112 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425133 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425468 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425484 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425521 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425549 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425704 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425859 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425975 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425994 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.426326 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.425235 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.426484 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.426655 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.426959 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427134 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427206 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427548 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427910 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427930 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427947 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427965 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427982 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.427999 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428013 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428028 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428044 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428061 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428089 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428103 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428117 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428134 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428209 4751 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428234 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.428972 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429167 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429573 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429622 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429828 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429848 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429864 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429882 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429897 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429912 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429926 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429943 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429958 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429974 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.429989 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430004 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430023 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430040 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430056 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430088 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431421 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431440 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431562 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431586 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431625 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431644 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431661 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431676 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431692 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431707 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431723 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431742 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431758 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431775 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431792 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431807 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431823 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431839 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431855 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431872 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431902 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431919 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431935 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432649 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432668 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432683 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432698 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432713 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432727 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432742 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432757 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 14:41:57 crc 
kubenswrapper[4751]: I0131 14:41:57.432774 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432790 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432805 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432831 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432854 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432870 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432885 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432900 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432915 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.432929 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.433008 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 
crc kubenswrapper[4751]: I0131 14:41:57.432609 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.433277 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.433295 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.434197 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.434246 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.434339 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.434372 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.434409 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.434996 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435117 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435153 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435186 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435218 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435251 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435286 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436346 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436377 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436398 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436417 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436434 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436451 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436469 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436487 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436505 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436524 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436542 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 14:41:57 crc 
kubenswrapper[4751]: I0131 14:41:57.436558 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436578 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436595 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436645 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436662 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436679 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436696 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436718 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436734 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436752 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436768 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436784 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436799 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436815 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436831 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436846 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436861 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436875 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436892 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436909 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436924 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439029 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439148 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439229 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439265 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439379 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440187 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440233 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440268 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440302 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440339 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440383 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440417 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 14:41:57 crc 
kubenswrapper[4751]: I0131 14:41:57.440450 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440484 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440520 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440556 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440602 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440637 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440670 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440704 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440739 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440807 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440848 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440885 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440925 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.440965 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441004 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441042 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441103 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441140 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441211 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc 
kubenswrapper[4751]: I0131 14:41:57.441245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441283 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441323 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441413 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441438 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441461 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441481 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441501 4751 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441522 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441541 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441561 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441581 4751 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441600 4751 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441621 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441641 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441660 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441679 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441699 4751 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441719 4751 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441756 4751 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441775 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441796 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441815 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441834 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441854 4751 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441874 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441892 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node 
\"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441913 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441933 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441952 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441971 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.441991 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442011 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442032 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc 
kubenswrapper[4751]: I0131 14:41:57.442052 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442096 4751 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442118 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442139 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442157 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442177 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442196 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442215 4751 
reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442235 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442364 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442388 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442408 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442615 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.442638 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.443190 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.443232 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.443246 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430295 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430348 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430382 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430820 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.430864 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431036 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431199 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.431354 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435277 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435586 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435626 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.435734 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436013 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436122 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436231 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436255 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436682 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.436796 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.437177 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.437328 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.437383 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.437470 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.438634 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.438878 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439035 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439048 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439060 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439271 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.439937 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.444847 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.446114 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.446174 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.446227 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.446534 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.446689 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.447026 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.447126 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.447766 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.447963 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448165 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448201 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448385 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448635 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448887 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.448949 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.449085 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.449248 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.449493 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.449906 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.450425 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.456236 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.456641 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.456719 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.457016 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.457444 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.458294 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.459095 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:41:57.959030628 +0000 UTC m=+22.333743543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.459791 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.460031 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.460209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.460626 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.461341 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462034 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462150 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462207 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462298 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462584 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462494 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462966 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.462805 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463100 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463226 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463655 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463704 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463752 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463860 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.463939 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.464244 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.464153 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.464850 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.464983 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.464999 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.465010 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.465050 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:57.965040326 +0000 UTC m=+22.339753211 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.465665 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.465825 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.465826 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.466325 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.466821 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.466653 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.467121 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.474892 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.475596 4751 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.477180 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.478498 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.479134 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.479257 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:57.979218136 +0000 UTC m=+22.353931131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.481151 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.481386 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.481479 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:57.981202698 +0000 UTC m=+22.355915583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.481464 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.481777 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.481967 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.482023 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.482088 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.482537 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.482648 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.482962 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.483177 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.483368 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.483641 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.483843 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.484350 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.484439 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.484660 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.484802 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.484828 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.484844 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.484898 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-31 14:41:57.984877944 +0000 UTC m=+22.359590839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.508247 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.508866 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.508900 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.509376 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.509579 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.510944 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.510999 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.511748 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.511829 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.511951 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.511938 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.511855 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.512066 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.512302 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.512804 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.512969 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.513185 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.513254 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.513462 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.513495 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.513533 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.513870 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.514143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.515123 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.515394 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.522702 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.522999 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.523304 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.523780 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.523899 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.525734 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.525937 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526183 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526215 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526442 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526468 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526483 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526693 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526756 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.526832 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.527052 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.527115 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.527396 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.527663 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.527886 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.528346 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.530764 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-
operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.536412 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.543046 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544708 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544774 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544842 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544855 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544868 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544877 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544885 4751 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544893 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544901 4751 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544909 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544918 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544926 4751 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544934 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544943 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544951 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544959 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544968 4751 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544976 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544985 4751 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.544994 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545002 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545012 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545020 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545028 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545036 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545192 4751 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545204 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545212 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc 
kubenswrapper[4751]: I0131 14:41:57.545221 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545229 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545237 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545245 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545254 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545261 4751 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545269 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545277 4751 reconciler_common.go:293] "Volume 
detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545286 4751 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545294 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545302 4751 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545310 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545318 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545326 4751 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545333 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545341 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545350 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545358 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545367 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545379 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545391 4751 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545403 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on 
node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545415 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545497 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545510 4751 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545520 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545533 4751 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545544 4751 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545558 4751 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545569 4751 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545582 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545592 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545604 4751 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545614 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545627 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545638 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545651 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545662 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545674 4751 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545685 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545699 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545710 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545724 4751 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545735 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545749 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545764 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545780 4751 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545796 4751 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545807 4751 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545819 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545830 4751 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 
14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545842 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545856 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545866 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545877 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545883 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545888 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545936 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 
14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545945 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545955 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545964 4751 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545974 4751 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545982 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545992 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546003 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546013 4751 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546023 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546033 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546041 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546050 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546057 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546111 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546119 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 31 
14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546129 4751 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546137 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546146 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546154 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546162 4751 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546170 4751 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546178 4751 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546187 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546196 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546203 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546214 4751 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546223 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546231 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546240 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546248 4751 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") 
on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546259 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546269 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546277 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546285 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546293 4751 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546302 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546310 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546318 4751 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546326 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546334 4751 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546342 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546350 4751 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546359 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546367 4751 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546375 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546384 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546392 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546400 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546409 4751 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546417 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546426 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546434 4751 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546443 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546452 4751 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546460 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546469 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546477 4751 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546485 4751 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546495 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node 
\"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546471 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546503 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.546641 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.545833 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.554000 4751 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.555188 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.555289 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19"} Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.555361 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.555657 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:57 crc kubenswrapper[4751]: E0131 14:41:57.559812 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.561491 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.566517 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.569752 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.580414 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.592243 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.606339 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.617149 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.628313 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.638601 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.648018 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.648049 4751 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.663172 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.680871 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 14:41:57 crc kubenswrapper[4751]: I0131 14:41:57.696178 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 14:41:57 crc kubenswrapper[4751]: W0131 14:41:57.711880 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-4d3590e243eaca0ee5e56d7c6d15fb2c545a920dec35c8fae50837d96db3b72b WatchSource:0}: Error finding container 4d3590e243eaca0ee5e56d7c6d15fb2c545a920dec35c8fae50837d96db3b72b: Status 404 returned error can't find the container with id 4d3590e243eaca0ee5e56d7c6d15fb2c545a920dec35c8fae50837d96db3b72b Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.051436 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.051534 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.051584 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051604 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:41:59.051583869 +0000 UTC m=+23.426296764 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.051633 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.051667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051709 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051727 4751 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051760 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:59.051750113 +0000 UTC m=+23.426463018 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051775 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:59.051766954 +0000 UTC m=+23.426479849 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051837 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051853 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051865 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051895 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:59.051886237 +0000 UTC m=+23.426599132 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051946 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051957 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051966 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.051992 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:41:59.051983459 +0000 UTC m=+23.426696354 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.358639 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 13:47:06.726019216 +0000 UTC Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.412780 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.414154 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.417181 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.418771 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.419959 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.421047 4751 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.422251 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.423422 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.424661 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.425775 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.426777 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.429328 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.430374 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.431489 4751 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.432557 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.433634 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.434762 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.435578 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.436753 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.439456 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.440725 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.442160 4751 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.443148 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.445366 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.446277 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.448121 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.449552 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.450568 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.451843 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.452806 4751 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.453819 4751 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.454025 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.456998 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.458019 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.460301 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.462643 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.463995 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 
14:41:58.465160 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.466453 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.467859 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.468810 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.470050 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.472837 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.475508 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.476701 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 
14:41:58.479216 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.480619 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.483765 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.485427 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.487490 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.488463 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.489509 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.490869 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 
14:41:58.491928 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.560824 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"202411a00c080441e6f714d59fc005cf3be5bb4c7484ec618e42efd4b8389e50"} Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.563847 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27"} Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.563894 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5bac338294303499772c33b17e0d59dadfd61bbde41282085f90771886819933"} Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.566267 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293"} Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.566379 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3"} Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.566411 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4d3590e243eaca0ee5e56d7c6d15fb2c545a920dec35c8fae50837d96db3b72b"} Jan 31 14:41:58 crc kubenswrapper[4751]: E0131 14:41:58.576010 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.591565 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.611190 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.634361 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.655856 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.680866 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.707638 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.734219 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.758522 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.777136 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.828459 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.847573 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.864625 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.883414 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.896712 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.909328 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:58 crc kubenswrapper[4751]: I0131 14:41:58.926894 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:41:58Z is after 2025-08-24T17:21:41Z" Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.060539 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.060645 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.060696 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.060736 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.060772 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.060870 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.060938 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:01.060916911 +0000 UTC m=+25.435629836 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061485 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:42:01.061466185 +0000 UTC m=+25.436179110 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061604 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061651 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061672 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061719 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:01.061705132 +0000 UTC m=+25.436418057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061792 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061857 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:01.061844435 +0000 UTC m=+25.436557360 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061941 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061965 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.061981 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.062019 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:01.062007219 +0000 UTC m=+25.436720144 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.358832 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 19:04:15.936801103 +0000 UTC Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.405746 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.405804 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:41:59 crc kubenswrapper[4751]: I0131 14:41:59.405894 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.406026 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.406258 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:41:59 crc kubenswrapper[4751]: E0131 14:41:59.406380 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:00 crc kubenswrapper[4751]: I0131 14:42:00.359981 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 12:49:07.422974436 +0000 UTC Jan 31 14:42:00 crc kubenswrapper[4751]: I0131 14:42:00.726620 4751 csr.go:261] certificate signing request csr-h8p2w is approved, waiting to be issued Jan 31 14:42:00 crc kubenswrapper[4751]: I0131 14:42:00.778544 4751 csr.go:257] certificate signing request csr-h8p2w is issued Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.077199 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 
31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.077292 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077369 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:42:05.077331528 +0000 UTC m=+29.452044403 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077380 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077433 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:05.077426481 +0000 UTC m=+29.452139366 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.077473 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.077536 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.077585 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077654 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077703 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:05.077689797 +0000 UTC m=+29.452402682 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077757 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077777 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077783 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077837 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077852 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077933 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:05.077910213 +0000 UTC m=+29.452623098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077792 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.077985 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:05.077978375 +0000 UTC m=+29.452691260 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.361155 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 09:46:24.694883137 +0000 UTC Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.405280 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.405298 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.405516 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.405594 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.406025 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:01 crc kubenswrapper[4751]: E0131 14:42:01.406323 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.576112 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf"} Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.639462 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.653555 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2wpj7"] Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.654063 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.656038 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.656620 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.656721 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.663055 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-68hvr"] Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.663370 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.663843 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.668424 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.668484 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.668515 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.668479 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.681214 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.698417 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.713291 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.727221 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.738958 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.750641 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.764041 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18
fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.776624 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.779867 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-31 14:37:00 +0000 UTC, rotation deadline is 2026-10-14 12:04:05.425901842 +0000 UTC Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.779964 4751 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6141h22m3.645941634s for next certificate rotation Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.784242 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-proxy-tls\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 
14:42:01.784273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.784300 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-rootfs\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.784324 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv47c\" (UniqueName: \"kubernetes.io/projected/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-kube-api-access-fv47c\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.784351 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/658471aa-68b2-478e-9522-ef5533009174-hosts-file\") pod \"node-resolver-68hvr\" (UID: \"658471aa-68b2-478e-9522-ef5533009174\") " pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.784365 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbl8x\" (UniqueName: \"kubernetes.io/projected/658471aa-68b2-478e-9522-ef5533009174-kube-api-access-nbl8x\") pod \"node-resolver-68hvr\" (UID: 
\"658471aa-68b2-478e-9522-ef5533009174\") " pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.789100 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.801201 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.814283 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.827403 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.837378 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.852155 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.877152 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885126 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv47c\" (UniqueName: \"kubernetes.io/projected/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-kube-api-access-fv47c\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885210 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/658471aa-68b2-478e-9522-ef5533009174-hosts-file\") pod \"node-resolver-68hvr\" (UID: \"658471aa-68b2-478e-9522-ef5533009174\") " pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbl8x\" (UniqueName: \"kubernetes.io/projected/658471aa-68b2-478e-9522-ef5533009174-kube-api-access-nbl8x\") pod \"node-resolver-68hvr\" (UID: \"658471aa-68b2-478e-9522-ef5533009174\") " pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885302 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-proxy-tls\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885338 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wpj7\" (UID: 
\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885414 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-rootfs\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885425 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/658471aa-68b2-478e-9522-ef5533009174-hosts-file\") pod \"node-resolver-68hvr\" (UID: \"658471aa-68b2-478e-9522-ef5533009174\") " pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.885503 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-rootfs\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.886190 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.891179 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-proxy-tls\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " 
pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.899535 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e1
7815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.907280 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv47c\" (UniqueName: \"kubernetes.io/projected/b4c170e8-22c9-43a9-8b34-9d626c2ccddc-kube-api-access-fv47c\") pod \"machine-config-daemon-2wpj7\" (UID: \"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\") " pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.915543 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbl8x\" (UniqueName: 
\"kubernetes.io/projected/658471aa-68b2-478e-9522-ef5533009174-kube-api-access-nbl8x\") pod \"node-resolver-68hvr\" (UID: \"658471aa-68b2-478e-9522-ef5533009174\") " pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.921175 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.969155 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:42:01 crc kubenswrapper[4751]: I0131 14:42:01.977816 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-68hvr" Jan 31 14:42:01 crc kubenswrapper[4751]: W0131 14:42:01.982665 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c170e8_22c9_43a9_8b34_9d626c2ccddc.slice/crio-05d1d264c2b92daddfdd471c8e47a1f262c2bb5d610c43c79d89680bbd0aeaad WatchSource:0}: Error finding container 05d1d264c2b92daddfdd471c8e47a1f262c2bb5d610c43c79d89680bbd0aeaad: Status 404 returned error can't find the container with id 05d1d264c2b92daddfdd471c8e47a1f262c2bb5d610c43c79d89680bbd0aeaad Jan 31 14:42:01 crc kubenswrapper[4751]: W0131 14:42:01.997743 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658471aa_68b2_478e_9522_ef5533009174.slice/crio-e2832894341213686ed02201435b0fa24e74d13736b436632757af4a77da3862 WatchSource:0}: Error finding container e2832894341213686ed02201435b0fa24e74d13736b436632757af4a77da3862: Status 404 returned error can't find the container with id e2832894341213686ed02201435b0fa24e74d13736b436632757af4a77da3862 Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.053564 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rp5sb"] Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.054408 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-rtthp"] Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.054647 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.055175 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.056598 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.057054 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.057391 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.058824 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.059133 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.059344 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.059931 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.072012 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.084719 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.092974 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.103641 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.117800 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.132710 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.148283 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.164568 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.179545 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187149 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187195 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-hostroot\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187240 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/e7dd989b-33df-4562-a60b-f273428fea3d-multus-daemon-config\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187261 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-k8s-cni-cncf-io\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187313 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-etc-kubernetes\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187331 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-cni-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187347 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-cnibin\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-kubelet\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187493 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-netns\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187562 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-conf-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187610 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-system-cni-dir\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187659 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-multus-certs\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187683 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-system-cni-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187704 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-os-release\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187730 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-socket-dir-parent\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187832 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5353863-ec39-4357-9b86-9be42ca17916-cni-binary-copy\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187896 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-cni-multus\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187919 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgxmq\" (UniqueName: 
\"kubernetes.io/projected/c5353863-ec39-4357-9b86-9be42ca17916-kube-api-access-tgxmq\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.187989 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-cni-bin\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.188030 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c5353863-ec39-4357-9b86-9be42ca17916-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.188115 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrtf\" (UniqueName: \"kubernetes.io/projected/e7dd989b-33df-4562-a60b-f273428fea3d-kube-api-access-hwrtf\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.188153 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-cnibin\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.188185 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-os-release\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.188236 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7dd989b-33df-4562-a60b-f273428fea3d-cni-binary-copy\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.193954 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.212928 4751 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.214781 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.214820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.214832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.214977 4751 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.217900 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.221273 4751 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.221532 4751 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.222569 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.222634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.222647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.222664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.222677 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.231355 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.240757 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: E0131 14:42:02.243306 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.247168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.247217 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.247229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.247257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.247271 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.253586 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: E0131 14:42:02.258438 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.261262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.261307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.261321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.261337 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.261350 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.266886 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: E0131 14:42:02.271943 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.275523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.275550 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.275557 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.275570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.275581 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.278371 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.288865 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c5353863-ec39-4357-9b86-9be42ca17916-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.288911 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrtf\" (UniqueName: \"kubernetes.io/projected/e7dd989b-33df-4562-a60b-f273428fea3d-kube-api-access-hwrtf\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.288956 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-cnibin\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.289087 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-cnibin\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.288999 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-os-release\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.289234 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-os-release\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: E0131 14:42:02.289017 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.289300 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7dd989b-33df-4562-a60b-f273428fea3d-cni-binary-copy\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.289684 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c5353863-ec39-4357-9b86-9be42ca17916-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.289896 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.289977 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7dd989b-33df-4562-a60b-f273428fea3d-cni-binary-copy\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290012 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290120 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-hostroot\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290060 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-hostroot\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290199 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e7dd989b-33df-4562-a60b-f273428fea3d-multus-daemon-config\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290427 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-k8s-cni-cncf-io\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290763 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e7dd989b-33df-4562-a60b-f273428fea3d-multus-daemon-config\") pod 
\"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290850 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-k8s-cni-cncf-io\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290894 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-etc-kubernetes\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290921 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-cni-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290942 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-cnibin\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.290963 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-kubelet\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 
14:42:02.290984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-netns\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291006 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-conf-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291012 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-etc-kubernetes\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291026 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-system-cni-dir\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291048 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-netns\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291051 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-multus-certs\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291057 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-cnibin\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291093 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-system-cni-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291145 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-os-release\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291094 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-conf-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291157 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-run-multus-certs\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 
crc kubenswrapper[4751]: I0131 14:42:02.291144 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5353863-ec39-4357-9b86-9be42ca17916-system-cni-dir\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291177 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-kubelet\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291215 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-os-release\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291226 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-system-cni-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291189 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-socket-dir-parent\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291262 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5353863-ec39-4357-9b86-9be42ca17916-cni-binary-copy\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291266 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-cni-dir\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291278 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-multus-socket-dir-parent\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291299 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-cni-multus\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291323 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgxmq\" (UniqueName: \"kubernetes.io/projected/c5353863-ec39-4357-9b86-9be42ca17916-kube-api-access-tgxmq\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291350 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-cni-bin\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291357 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-cni-multus\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291398 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7dd989b-33df-4562-a60b-f273428fea3d-host-var-lib-cni-bin\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.291864 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5353863-ec39-4357-9b86-9be42ca17916-cni-binary-copy\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.292515 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.292898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.292921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.292929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.292943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.292952 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: E0131 14:42:02.306673 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: E0131 14:42:02.307103 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.307455 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrtf\" (UniqueName: \"kubernetes.io/projected/e7dd989b-33df-4562-a60b-f273428fea3d-kube-api-access-hwrtf\") pod \"multus-rtthp\" (UID: \"e7dd989b-33df-4562-a60b-f273428fea3d\") " pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.309118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.309188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.309203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.309221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 
crc kubenswrapper[4751]: I0131 14:42:02.309239 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.311554 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgxmq\" (UniqueName: \"kubernetes.io/projected/c5353863-ec39-4357-9b86-9be42ca17916-kube-api-access-tgxmq\") pod \"multus-additional-cni-plugins-rp5sb\" (UID: \"c5353863-ec39-4357-9b86-9be42ca17916\") " pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.314774 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.327686 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.339272 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.352900 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.361755 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 22:46:45.908673933 +0000 UTC Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.368102 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rtthp" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.368704 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.374097 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" Jan 31 14:42:02 crc kubenswrapper[4751]: W0131 14:42:02.377805 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dd989b_33df_4562_a60b_f273428fea3d.slice/crio-ae820f722f84ee3f3836bd980ddcedf25d9d1ac2247797066e9c1251fe6f89a0 WatchSource:0}: Error finding container ae820f722f84ee3f3836bd980ddcedf25d9d1ac2247797066e9c1251fe6f89a0: Status 404 returned error can't find the container with id ae820f722f84ee3f3836bd980ddcedf25d9d1ac2247797066e9c1251fe6f89a0 Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.381717 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.396682 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.411757 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.411799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.411811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 
14:42:02.411826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.411836 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.419620 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8cdt"] Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.420560 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.427365 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.427555 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.427673 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.427716 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.427739 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.427841 4751 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.428059 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.441314 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.453956 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.464877 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.475950 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.490216 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.504154 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.516026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.516062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.516181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.516203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.516216 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.516529 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.535705 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.553900 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.569033 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.580739 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerStarted","Data":"27d2e140808d508ff439e4cbc7870480463c92690227b2c341c4d3b77b5e3e73"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.581696 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.583108 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerStarted","Data":"7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.583163 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerStarted","Data":"ae820f722f84ee3f3836bd980ddcedf25d9d1ac2247797066e9c1251fe6f89a0"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.585271 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-68hvr" event={"ID":"658471aa-68b2-478e-9522-ef5533009174","Type":"ContainerStarted","Data":"a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53"} Jan 
31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.585330 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-68hvr" event={"ID":"658471aa-68b2-478e-9522-ef5533009174","Type":"ContainerStarted","Data":"e2832894341213686ed02201435b0fa24e74d13736b436632757af4a77da3862"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.589832 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.589877 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.589889 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"05d1d264c2b92daddfdd471c8e47a1f262c2bb5d610c43c79d89680bbd0aeaad"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593823 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-var-lib-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593850 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593866 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-script-lib\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593881 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-slash\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593897 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovn-node-metrics-cert\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593913 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-netns\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593925 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-ovn\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593941 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-etc-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593956 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-netd\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.593970 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-kubelet\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594029 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594112 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594155 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-node-log\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594182 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-log-socket\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594212 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-systemd-units\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594259 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-config\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594289 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-env-overrides\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594319 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-bin\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594342 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmb7\" (UniqueName: \"kubernetes.io/projected/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-kube-api-access-zhmb7\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.594362 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-systemd\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.597347 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.619384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.619427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.619438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.619458 4751 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.619471 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.619698 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.634166 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.647026 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.661509 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.673677 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.689914 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695531 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-netns\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695570 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-ovn\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695596 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-etc-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: 
\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695619 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-netd\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695641 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-etc-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695643 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-kubelet\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695669 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-kubelet\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695661 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-netns\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 
crc kubenswrapper[4751]: I0131 14:42:02.695699 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-netd\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695702 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695619 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-ovn\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695748 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695882 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-node-log\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695905 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-log-socket\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695914 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695974 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-systemd-units\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695989 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-log-socket\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695999 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-config\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.695999 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696038 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-node-log\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696023 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-env-overrides\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696015 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-systemd-units\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696238 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-systemd\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696276 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-bin\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696298 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmb7\" (UniqueName: \"kubernetes.io/projected/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-kube-api-access-zhmb7\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696341 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696362 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-script-lib\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696397 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-var-lib-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696411 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-systemd\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696437 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-slash\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696451 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-var-lib-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696464 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovn-node-metrics-cert\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696469 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-openvswitch\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696478 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-slash\") pod 
\"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696671 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-env-overrides\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696729 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-config\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.696716 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-bin\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.697148 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-script-lib\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.699767 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovn-node-metrics-cert\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.709265 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.712172 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmb7\" (UniqueName: \"kubernetes.io/projected/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-kube-api-access-zhmb7\") pod \"ovnkube-node-n8cdt\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.722313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.722358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.722370 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.722389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.722401 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.724243 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.733528 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.749713 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.753979 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:02 crc kubenswrapper[4751]: W0131 14:42:02.770153 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceef6ba7_8d2d_4105_beee_6a8bdfd12c9b.slice/crio-4bbc5e8f3ce6775d094673644f5cb7355eba674b33cab2a960c6b275357e72b8 WatchSource:0}: Error finding container 4bbc5e8f3ce6775d094673644f5cb7355eba674b33cab2a960c6b275357e72b8: Status 404 returned error can't find the container with id 4bbc5e8f3ce6775d094673644f5cb7355eba674b33cab2a960c6b275357e72b8 Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.794652 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.824086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.824124 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.824133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.824157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.824167 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.832539 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.870682 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.911472 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.926647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.926697 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.926705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:02 crc 
kubenswrapper[4751]: I0131 14:42:02.926720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:02 crc kubenswrapper[4751]: I0131 14:42:02.926728 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:02Z","lastTransitionTime":"2026-01-31T14:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.029435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.029473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.029483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.029496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.029505 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.132655 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.132698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.132713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.132730 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.132741 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.235984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.236516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.236526 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.236552 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.236568 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.338958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.339001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.339010 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.339024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.339033 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.362514 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 21:44:09.525256209 +0000 UTC Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.405086 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.405125 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.405151 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:03 crc kubenswrapper[4751]: E0131 14:42:03.405229 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:03 crc kubenswrapper[4751]: E0131 14:42:03.405329 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:03 crc kubenswrapper[4751]: E0131 14:42:03.405408 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.441925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.441966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.441979 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.442001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.442014 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.545210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.545273 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.545292 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.545316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.545348 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.595281 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9" exitCode=0 Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.595399 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.595479 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"4bbc5e8f3ce6775d094673644f5cb7355eba674b33cab2a960c6b275357e72b8"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.597197 4751 generic.go:334] "Generic (PLEG): container finished" podID="c5353863-ec39-4357-9b86-9be42ca17916" containerID="53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39" exitCode=0 Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.597233 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerDied","Data":"53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.617012 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.637787 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.648111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.648158 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.648174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.648197 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.648211 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.653396 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.684087 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.699531 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.716879 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.729896 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.742433 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.751860 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.751919 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.751931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc 
kubenswrapper[4751]: I0131 14:42:03.751947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.751959 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.757351 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.769432 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.784792 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.802888 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.823914 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.838368 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 
14:42:03.851622 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.855415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.855467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.855479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.855494 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.855525 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.876316 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.890389 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.907971 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.912645 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lxrfr"] Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.912995 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.915576 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.915662 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.915550 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.915670 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.927373 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.948832 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.960163 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.960201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.960213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.960231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.960242 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:03Z","lastTransitionTime":"2026-01-31T14:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.962775 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.977877 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:03 crc kubenswrapper[4751]: I0131 14:42:03.989237 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:03Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.002377 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.015610 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.031282 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.062324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.062358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc 
kubenswrapper[4751]: I0131 14:42:04.062366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.062378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.062386 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.071447 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.108705 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b895e2a-7887-41c3-b641-9c72bb085dda-host\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.108738 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9hbh\" 
(UniqueName: \"kubernetes.io/projected/6b895e2a-7887-41c3-b641-9c72bb085dda-kube-api-access-s9hbh\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.108774 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b895e2a-7887-41c3-b641-9c72bb085dda-serviceca\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.109789 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.150036 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.164652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.164687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.164695 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.164709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.164719 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.188312 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.209822 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b895e2a-7887-41c3-b641-9c72bb085dda-serviceca\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.209874 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b895e2a-7887-41c3-b641-9c72bb085dda-host\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.209896 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9hbh\" (UniqueName: 
\"kubernetes.io/projected/6b895e2a-7887-41c3-b641-9c72bb085dda-kube-api-access-s9hbh\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.210037 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b895e2a-7887-41c3-b641-9c72bb085dda-host\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.210671 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b895e2a-7887-41c3-b641-9c72bb085dda-serviceca\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.239001 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.261583 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9hbh\" (UniqueName: \"kubernetes.io/projected/6b895e2a-7887-41c3-b641-9c72bb085dda-kube-api-access-s9hbh\") pod \"node-ca-lxrfr\" (UID: \"6b895e2a-7887-41c3-b641-9c72bb085dda\") " pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.267090 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.267127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.267136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 
14:42:04.267152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.267161 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.289497 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.330951 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.362731 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 18:13:54.043065574 +0000 UTC Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.369444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 
14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.369470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.369479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.369492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.369502 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.371790 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.411414 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.454182 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.471867 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.471904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.471914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.471929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.471940 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.493016 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.528243 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lxrfr" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.531950 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-contr
oller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: W0131 14:42:04.540333 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b895e2a_7887_41c3_b641_9c72bb085dda.slice/crio-3321dec98fc03add119161935af375b4c1f47e9db4373c363cff3f4a2824b027 WatchSource:0}: Error finding container 3321dec98fc03add119161935af375b4c1f47e9db4373c363cff3f4a2824b027: Status 404 returned error can't find the container with id 3321dec98fc03add119161935af375b4c1f47e9db4373c363cff3f4a2824b027 Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.571408 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.574263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.574296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.574305 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.574318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.574328 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.601849 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lxrfr" event={"ID":"6b895e2a-7887-41c3-b641-9c72bb085dda","Type":"ContainerStarted","Data":"3321dec98fc03add119161935af375b4c1f47e9db4373c363cff3f4a2824b027"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.604263 4751 generic.go:334] "Generic (PLEG): container finished" podID="c5353863-ec39-4357-9b86-9be42ca17916" containerID="c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db" exitCode=0 Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.604331 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerDied","Data":"c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.608814 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.608862 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.608873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.608881 4751 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.608890 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.608900 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.614584 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.651188 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.676653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc 
kubenswrapper[4751]: I0131 14:42:04.676693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.676704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.676720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.676730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.690618 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.730953 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.772392 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.779030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.779063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.779095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.779113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.779124 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.809291 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.847455 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.882378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.882426 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.882437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.882455 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.882468 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.893977 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.935302 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.967758 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.984562 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.984639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.984661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.984705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:04 crc kubenswrapper[4751]: I0131 14:42:04.984740 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:04Z","lastTransitionTime":"2026-01-31T14:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.025609 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.078420 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.087414 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.087479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.087495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.087523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.087549 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.097170 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.117853 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118028 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:42:13.117999377 +0000 UTC m=+37.492712262 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.118528 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.118582 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.118650 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118681 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.118691 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118700 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118742 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118750 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118765 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118776 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118781 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:13.118772487 +0000 UTC m=+37.493485362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118704 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118799 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:13.118790987 +0000 UTC m=+37.493503872 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118815 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:13.118809708 +0000 UTC m=+37.493522593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118821 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.118894 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:13.118864879 +0000 UTC m=+37.493577944 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.130842 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.170555 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.191050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.191092 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.191100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc 
kubenswrapper[4751]: I0131 14:42:05.191114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.191124 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.294432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.294543 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.294566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.294609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.294626 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.363630 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 13:26:52.867534864 +0000 UTC Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.398218 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.398301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.398321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.398352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.398371 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.405617 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.405672 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.405750 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.405804 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.405957 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:05 crc kubenswrapper[4751]: E0131 14:42:05.406047 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.502548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.502628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.502649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.502678 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.502701 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.605735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.605796 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.605813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.605837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.605854 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.616324 4751 generic.go:334] "Generic (PLEG): container finished" podID="c5353863-ec39-4357-9b86-9be42ca17916" containerID="d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e" exitCode=0 Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.616389 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerDied","Data":"d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.619652 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lxrfr" event={"ID":"6b895e2a-7887-41c3-b641-9c72bb085dda","Type":"ContainerStarted","Data":"b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.640357 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.659459 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.682206 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.697163 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.709185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.709226 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.709237 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.709252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.709263 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.714095 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.728872 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.741732 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.754875 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.769509 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 
14:42:05.786099 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.797567 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.811833 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.812721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.812772 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.812786 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.812806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.812821 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.825699 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.853517 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.867798 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.882525 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.900657 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.915600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.915654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.915671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.915694 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.915710 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:05Z","lastTransitionTime":"2026-01-31T14:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.916911 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 
14:42:05.931890 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:05 crc kubenswrapper[4751]: I0131 14:42:05.971512 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:05Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.013161 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31
T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.017882 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.017942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.017962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.017988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.018008 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.053320 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.106257 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.122793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.122858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.122883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.122913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.122936 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.137179 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.174951 4751 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.180326 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g/status\": read tcp 38.102.83.98:47400->38.102.83.98:6443: use of closed network connection" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.222962 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.225375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.225559 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.225685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.225807 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.225931 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.254788 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.298293 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.363928 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 21:40:35.355473091 +0000 UTC Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.365199 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.365278 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.365303 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.365335 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.365356 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.425933 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.445217 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.468008 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.468061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.468106 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.468134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.468152 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.480311 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.506904 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.531542 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.553439 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.571248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.571280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.571290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.571307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.571318 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.572024 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.613500 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.626339 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c"} Jan 31 14:42:06 
crc kubenswrapper[4751]: I0131 14:42:06.629383 4751 generic.go:334] "Generic (PLEG): container finished" podID="c5353863-ec39-4357-9b86-9be42ca17916" containerID="3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e" exitCode=0 Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.630273 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerDied","Data":"3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.654890 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c
07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.673787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.673824 4751 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.673837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.673855 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.673866 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.694227 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.735200 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.776109 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 
14:42:06.776564 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.776591 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.776603 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.776620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.776634 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.815642 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.851898 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.879415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.879460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.879470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.879485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.879495 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.898038 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.933004 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.978762 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31
T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.982919 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.983018 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.983035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.983063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:06 crc kubenswrapper[4751]: I0131 14:42:06.983130 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:06Z","lastTransitionTime":"2026-01-31T14:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.011956 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.065317 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.085953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.086034 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.086099 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.086131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.086150 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.101204 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.139133 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.175968 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.188723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.188792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.188816 4751 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.188844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.188862 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.216826 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.255954 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.292035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.292143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.292169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc 
kubenswrapper[4751]: I0131 14:42:07.292211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.292242 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.297383 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.337890 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.364779 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 10:20:01.305474936 +0000 UTC Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.379200 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.397065 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.397423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.397632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.397827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.398013 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.405130 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:07 crc kubenswrapper[4751]: E0131 14:42:07.405500 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.405584 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:07 crc kubenswrapper[4751]: E0131 14:42:07.405886 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.405643 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:07 crc kubenswrapper[4751]: E0131 14:42:07.406262 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.424800 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.502730 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.503119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.503263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.503420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.503550 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.606625 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.606651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.606661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.606677 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.606688 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.640476 4751 generic.go:334] "Generic (PLEG): container finished" podID="c5353863-ec39-4357-9b86-9be42ca17916" containerID="fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d" exitCode=0 Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.640536 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerDied","Data":"fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.654242 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.671950 4751 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.691776 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.704367 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.712514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.712575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.712595 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc 
kubenswrapper[4751]: I0131 14:42:07.712620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.712637 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.716013 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.731682 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.751974 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.765679 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.783218 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.814871 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.815954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.817098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.817109 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.817123 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.817134 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.853912 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z 
is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.902304 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.920934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.921169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.921247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.921365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.921455 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:07Z","lastTransitionTime":"2026-01-31T14:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.929834 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:07 crc kubenswrapper[4751]: I0131 14:42:07.972005 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:07Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.024316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.024377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.024396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.024423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.024440 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.127306 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.127363 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.127380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.127454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.127475 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.230808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.230888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.230915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.230953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.230975 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.334308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.334379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.334397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.334422 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.334439 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.365868 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:10:37.784530189 +0000 UTC Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.437212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.437267 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.437285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.437311 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.437328 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.540200 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.540235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.540245 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.540262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.540275 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.644149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.644717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.644828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.644911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.644985 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.652012 4751 generic.go:334] "Generic (PLEG): container finished" podID="c5353863-ec39-4357-9b86-9be42ca17916" containerID="2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc" exitCode=0 Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.652096 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerDied","Data":"2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.679229 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.694690 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.725358 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.744234 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 
14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.747244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.747357 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.747443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.747576 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.747684 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.760031 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.779951 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.799429 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.816439 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.833946 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.849782 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.850658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.850716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.850735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc 
kubenswrapper[4751]: I0131 14:42:08.850760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.850781 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.869122 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.889745 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.902258 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.914150 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:08Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.952998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.953040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.953054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.953094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:08 crc kubenswrapper[4751]: I0131 14:42:08.953115 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:08Z","lastTransitionTime":"2026-01-31T14:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.055859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.055931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.055951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.055977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.055996 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.158972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.159001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.159011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.159027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.159038 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.262278 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.262340 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.262365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.262393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.262415 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.369776 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 19:58:13.040301254 +0000 UTC Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.371537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.371716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.371745 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.371837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.371862 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.405290 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.405365 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.405424 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:09 crc kubenswrapper[4751]: E0131 14:42:09.405739 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:09 crc kubenswrapper[4751]: E0131 14:42:09.405894 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:09 crc kubenswrapper[4751]: E0131 14:42:09.406023 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.476182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.476252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.476270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.476297 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.476317 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.580324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.580379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.580395 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.580418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.580436 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.662779 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" event={"ID":"c5353863-ec39-4357-9b86-9be42ca17916","Type":"ContainerStarted","Data":"99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.672508 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.673523 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.673775 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.683028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.683115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.683142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.683177 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.683205 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.685222 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.702430 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.757319 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.758822 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.785663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.785713 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.785730 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.785756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.785775 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.803844 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.828961 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.843582 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.857513 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.870824 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.882695 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.888206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.888255 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.888272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc 
kubenswrapper[4751]: I0131 14:42:09.888301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.888319 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.898316 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.917414 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.933358 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.955117 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.971208 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.983634 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:09Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.990423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.990487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.990506 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.990531 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:09 crc kubenswrapper[4751]: I0131 14:42:09.990548 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:09Z","lastTransitionTime":"2026-01-31T14:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.004198 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.021046 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.040092 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31
T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.054265 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.084527 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.093101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.093152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.093169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.093193 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.093209 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.099985 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.118988 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.142719 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.174605 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.196003 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.196299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.196373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.196454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.196524 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.197822 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.219515 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f
9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.232328 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.244295 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.260507 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:10Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.299636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.299685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.299699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.299717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.299730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.370343 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:57:01.065586986 +0000 UTC Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.402773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.403062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.403296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.403525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.403717 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.507005 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.507105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.507132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.507160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.507184 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.609841 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.609916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.609941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.609974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.609997 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.676754 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.712765 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.713352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.713741 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.713929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.714225 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.983098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.983179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.983205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.983239 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:10 crc kubenswrapper[4751]: I0131 14:42:10.983276 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:10Z","lastTransitionTime":"2026-01-31T14:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.086949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.087000 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.087016 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.087039 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.087056 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.189442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.189486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.189503 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.189526 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.189542 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.291771 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.291805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.291814 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.291827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.291836 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.370635 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 09:28:13.84131058 +0000 UTC Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.393939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.394170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.394244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.394322 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.394376 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.405026 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.405084 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.405137 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:11 crc kubenswrapper[4751]: E0131 14:42:11.405453 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:11 crc kubenswrapper[4751]: E0131 14:42:11.405272 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:11 crc kubenswrapper[4751]: E0131 14:42:11.405504 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.496930 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.496962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.496970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.496983 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.496992 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.599944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.599978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.599988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.600003 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.600015 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.680295 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.702625 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.702965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.702983 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.703007 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.703024 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.806236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.806287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.806305 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.806329 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.806347 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.909533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.909571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.909588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.909609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:11 crc kubenswrapper[4751]: I0131 14:42:11.909626 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:11Z","lastTransitionTime":"2026-01-31T14:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.011975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.012028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.012045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.012096 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.012122 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.114442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.114474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.114486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.114500 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.114511 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.217399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.217456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.217474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.217498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.217515 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.319893 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.319946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.319963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.319984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.320000 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.371660 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 18:01:18.7718094 +0000 UTC Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.422551 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.422613 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.422638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.422667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.422692 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.468307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.468368 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.468392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.468418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.468441 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: E0131 14:42:12.489973 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.494262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.494319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.494343 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.494372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.494392 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: E0131 14:42:12.515193 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.521958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.522021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.522039 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.522099 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.522121 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: E0131 14:42:12.546278 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.552758 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.552813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.552831 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.552859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.552878 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: E0131 14:42:12.573639 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.583328 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.583447 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.583482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.583513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.583542 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: E0131 14:42:12.605324 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: E0131 14:42:12.605578 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.608391 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.608454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.608473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.608499 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.608519 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.686976 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/0.log" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.690741 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794" exitCode=1 Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.690802 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.691993 4751 scope.go:117] "RemoveContainer" containerID="c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.712422 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.712476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.712495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.712519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.712483 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.712538 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.726508 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.750059 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6
aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.770524 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.786879 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.800771 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.815250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.815327 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.815362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc 
kubenswrapper[4751]: I0131 14:42:12.815381 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.815395 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.820003 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.836955 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.852348 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.865104 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.882920 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.904134 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.917785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.917856 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.917872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.917892 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.917930 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:12Z","lastTransitionTime":"2026-01-31T14:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.921706 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:12 crc kubenswrapper[4751]: I0131 14:42:12.935240 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:12Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.020920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.020970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.020987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.021014 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.021036 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.124021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.124084 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.124098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.124117 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.124131 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.202095 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.202205 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202261 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:42:29.202210367 +0000 UTC m=+53.576923292 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.202307 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202335 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202352 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202365 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.202362 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202402 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:29.202391841 +0000 UTC m=+53.577104736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.202422 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202490 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202484 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:13 
crc kubenswrapper[4751]: E0131 14:42:13.202508 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202540 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202558 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202526 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:29.202517145 +0000 UTC m=+53.577230040 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202659 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:29.202622407 +0000 UTC m=+53.577335302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.202697 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:29.202682069 +0000 UTC m=+53.577395074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.225941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.225998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.226008 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.226024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.226033 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.328612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.328639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.328667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.328681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.328692 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.372396 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 08:43:20.213969369 +0000 UTC Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.405037 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.405042 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.405102 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.405304 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.405448 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:13 crc kubenswrapper[4751]: E0131 14:42:13.405531 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.430661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.430712 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.430722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.430738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.430747 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.533621 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.533680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.533697 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.533719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.533736 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.612066 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.634880 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.640658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.640743 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.640772 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.640831 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.640871 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.657122 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.677651 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.694691 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.699000 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/0.log" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.703423 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.703626 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.714404 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6
aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.731200 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.744484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.744530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.744547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc 
kubenswrapper[4751]: I0131 14:42:13.744573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.744590 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.752699 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14
:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 
14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.772784 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.791767 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.812853 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.833370 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.847410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.847509 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.847534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.847567 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.847593 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.852638 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.873229 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
26-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.896133 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mult
us-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.916693 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.939527 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a5
57f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:
42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.950602 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.950654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.950671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.950693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.950712 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:13Z","lastTransitionTime":"2026-01-31T14:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.959537 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d
95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.977960 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:13 crc kubenswrapper[4751]: I0131 14:42:13.997911 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:13Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.013878 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.044945 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.054560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.054614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.054632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.054699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.054721 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.066646 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.083313 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.108655 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.130581 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.147492 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.157611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.157682 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.157699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.158194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.158249 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.169353 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.189452 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.262379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.262437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.262455 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc 
kubenswrapper[4751]: I0131 14:42:14.262479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.262496 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.366061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.366186 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.366204 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.366230 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.366246 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.373382 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:43:15.311610298 +0000 UTC Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.377049 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q"] Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.382157 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.385373 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.387339 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.407549 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.413329 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd8c0730-67df-445e-a6ce-c2edce5d9c59-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.413599 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd8c0730-67df-445e-a6ce-c2edce5d9c59-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.413654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45q8c\" (UniqueName: \"kubernetes.io/projected/cd8c0730-67df-445e-a6ce-c2edce5d9c59-kube-api-access-45q8c\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.413698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd8c0730-67df-445e-a6ce-c2edce5d9c59-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.424838 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.445009 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31
T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.459403 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.472429 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.472496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.472519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.472545 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.472562 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.491870 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.513791 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.514547 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd8c0730-67df-445e-a6ce-c2edce5d9c59-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.514614 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45q8c\" (UniqueName: \"kubernetes.io/projected/cd8c0730-67df-445e-a6ce-c2edce5d9c59-kube-api-access-45q8c\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.514662 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd8c0730-67df-445e-a6ce-c2edce5d9c59-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.514829 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd8c0730-67df-445e-a6ce-c2edce5d9c59-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.517140 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd8c0730-67df-445e-a6ce-c2edce5d9c59-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.518673 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd8c0730-67df-445e-a6ce-c2edce5d9c59-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.524263 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd8c0730-67df-445e-a6ce-c2edce5d9c59-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.536866 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.544242 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45q8c\" (UniqueName: 
\"kubernetes.io/projected/cd8c0730-67df-445e-a6ce-c2edce5d9c59-kube-api-access-45q8c\") pod \"ovnkube-control-plane-749d76644c-4v79q\" (UID: \"cd8c0730-67df-445e-a6ce-c2edce5d9c59\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.556817 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.574361 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.575751 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.575795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.575813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc 
kubenswrapper[4751]: I0131 14:42:14.575837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.575855 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.596663 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14
:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 
14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.615520 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.635554 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.654034 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.676782 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.678436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.678853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.679005 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.679211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.679351 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.702275 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.703186 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.714469 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/1.log" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.715635 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/0.log" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.720771 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a" exitCode=1 Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.720823 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.720914 4751 scope.go:117] "RemoveContainer" containerID="c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794" Jan 31 14:42:14 crc kubenswrapper[4751]: 
I0131 14:42:14.722026 4751 scope.go:117] "RemoveContainer" containerID="bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a" Jan 31 14:42:14 crc kubenswrapper[4751]: E0131 14:42:14.722328 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:42:14 crc kubenswrapper[4751]: W0131 14:42:14.730616 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd8c0730_67df_445e_a6ce_c2edce5d9c59.slice/crio-f9785bdb93fca017a08b7a9cef8f01e0a6749cd8cb26f88592770c721ada1695 WatchSource:0}: Error finding container f9785bdb93fca017a08b7a9cef8f01e0a6749cd8cb26f88592770c721ada1695: Status 404 returned error can't find the container with id f9785bdb93fca017a08b7a9cef8f01e0a6749cd8cb26f88592770c721ada1695 Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.746801 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.771570 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.782697 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.782760 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.782777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.782803 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.782821 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.791793 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.812956 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.831245 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.850301 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.869402 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.885944 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.885999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.886019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.886043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.886061 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.887170 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z 
is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.906378 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.920243 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.939868 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.953013 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.968768 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31
T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.985633 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.990614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.990683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.990709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.990744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:14 crc kubenswrapper[4751]: I0131 14:42:14.990769 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:14Z","lastTransitionTime":"2026-01-31T14:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.013677 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 
14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.093863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 
14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.093916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.093926 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.093939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.093950 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.197118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.197174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.197191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.197215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.197231 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.301413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.301480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.301512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.301537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.301559 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.373624 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 07:01:40.415046217 +0000 UTC Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404743 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404760 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404829 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404917 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.404951 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:15 crc kubenswrapper[4751]: E0131 14:42:15.405145 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:15 crc kubenswrapper[4751]: E0131 14:42:15.405341 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:15 crc kubenswrapper[4751]: E0131 14:42:15.405475 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.508586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.508679 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.508703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.508737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.508760 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.612048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.612132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.612149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.612174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.612192 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.715031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.715142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.715161 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.715187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.715208 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.727181 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" event={"ID":"cd8c0730-67df-445e-a6ce-c2edce5d9c59","Type":"ContainerStarted","Data":"34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.727251 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" event={"ID":"cd8c0730-67df-445e-a6ce-c2edce5d9c59","Type":"ContainerStarted","Data":"f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.727270 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" event={"ID":"cd8c0730-67df-445e-a6ce-c2edce5d9c59","Type":"ContainerStarted","Data":"f9785bdb93fca017a08b7a9cef8f01e0a6749cd8cb26f88592770c721ada1695"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.730880 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/1.log" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.752219 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.778555 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.810212 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.817986 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.818023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.818035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.818051 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.818063 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.833601 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.848219 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.859802 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.864871 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xtn6l"] Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.865548 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:15 crc kubenswrapper[4751]: E0131 14:42:15.865641 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.872618 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.895518 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a5
57f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:
42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.908882 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.920564 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.920626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.920652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.920680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.920703 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:15Z","lastTransitionTime":"2026-01-31T14:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.927516 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.927613 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljmn\" (UniqueName: \"kubernetes.io/projected/68aeb9c7-d3c3-4c34-96ab-bb947421c504-kube-api-access-hljmn\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.929413 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.944363 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.961325 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:15 crc kubenswrapper[4751]: I0131 14:42:15.977989 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.000279 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 
14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:15Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.016748 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.022960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.023012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.023029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.023052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.023106 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.028733 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.028827 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljmn\" (UniqueName: \"kubernetes.io/projected/68aeb9c7-d3c3-4c34-96ab-bb947421c504-kube-api-access-hljmn\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:16 crc kubenswrapper[4751]: E0131 14:42:16.029250 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.038381 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: E0131 14:42:16.038555 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:16.538508711 +0000 UTC m=+40.913221606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.055930 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.067200 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljmn\" (UniqueName: \"kubernetes.io/projected/68aeb9c7-d3c3-4c34-96ab-bb947421c504-kube-api-access-hljmn\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.076706 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 
14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.090740 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.107808 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.121487 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.125826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.125868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.125879 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc 
kubenswrapper[4751]: I0131 14:42:16.125898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.125910 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.138830 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14
:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 
14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.154812 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.172381 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.189844 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.209441 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.228546 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.228633 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.228661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.228708 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.228734 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.231184 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.248193 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.266721 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.281976 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.299362 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc 
kubenswrapper[4751]: I0131 14:42:16.331575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.331636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.331653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.331680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.331697 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.373843 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 08:36:33.92356774 +0000 UTC Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.423996 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.434469 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.434545 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.434567 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.434593 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.434610 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.441728 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.455893 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.473608 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.489625 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.517291 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 
14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.538263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 
14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.538336 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.538365 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.538397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.538419 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.538331 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.560937 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.580199 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.601847 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.621011 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.633302 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:16 crc kubenswrapper[4751]: E0131 14:42:16.633583 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:16 crc kubenswrapper[4751]: E0131 14:42:16.633727 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:17.633692489 +0000 UTC m=+42.008405434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.637888 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.643588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.643638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.643656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.643680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.643697 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.656157 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.676545 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.697408 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plug
ins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2d
aed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.710164 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.746325 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.746381 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.746397 4751 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.746419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.746436 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.850312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.850372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.850391 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.850420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.850448 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.953895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.953961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.953981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.954005 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:16 crc kubenswrapper[4751]: I0131 14:42:16.954025 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:16Z","lastTransitionTime":"2026-01-31T14:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.057206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.057281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.057307 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.057336 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.057356 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.160618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.160705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.160731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.160763 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.160781 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.263932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.263991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.264009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.264034 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.264055 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.366392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.366475 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.366498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.366526 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.366544 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.374670 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 23:42:51.95951734 +0000 UTC Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.405381 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.405468 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.405475 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.405548 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:17 crc kubenswrapper[4751]: E0131 14:42:17.406363 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:17 crc kubenswrapper[4751]: E0131 14:42:17.406208 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:17 crc kubenswrapper[4751]: E0131 14:42:17.406507 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:17 crc kubenswrapper[4751]: E0131 14:42:17.406634 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.469232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.469298 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.469315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.469345 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.469363 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.571916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.571969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.571985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.572010 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.572028 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.642124 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:17 crc kubenswrapper[4751]: E0131 14:42:17.642354 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:17 crc kubenswrapper[4751]: E0131 14:42:17.642458 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:19.642427097 +0000 UTC m=+44.017140012 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.675659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.675718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.675735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.675761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.675780 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.778874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.778934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.778956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.778985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.779008 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.882410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.882469 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.882571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.882607 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.882631 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.984918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.984959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.984971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.984989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:17 crc kubenswrapper[4751]: I0131 14:42:17.985002 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:17Z","lastTransitionTime":"2026-01-31T14:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.088179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.088269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.088290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.088314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.088330 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.191722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.191774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.191792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.191816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.191836 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.294530 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.294615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.294636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.294659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.294677 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.374846 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 15:37:36.825698702 +0000 UTC Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.397237 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.397286 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.397302 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.397324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.397342 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.500399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.500464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.500486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.500532 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.500557 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.603473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.603542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.603560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.603584 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.603602 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.713707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.713758 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.713770 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.713787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.713798 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.816432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.816490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.816512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.816542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.816564 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.919925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.919998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.920021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.920049 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:18 crc kubenswrapper[4751]: I0131 14:42:18.920125 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:18Z","lastTransitionTime":"2026-01-31T14:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.023358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.023427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.023445 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.023470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.023487 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.127773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.127863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.127881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.127905 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.127923 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.231225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.231285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.231301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.231324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.231343 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.334580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.334644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.334663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.334688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.334706 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.375586 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 16:51:05.085805033 +0000 UTC Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.404908 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.404987 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:19 crc kubenswrapper[4751]: E0131 14:42:19.405423 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.405063 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:19 crc kubenswrapper[4751]: E0131 14:42:19.405539 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.405042 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:19 crc kubenswrapper[4751]: E0131 14:42:19.405629 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:19 crc kubenswrapper[4751]: E0131 14:42:19.405434 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.437168 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.437207 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.437215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.437229 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.437239 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.540362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.540435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.540457 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.540488 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.540509 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.643553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.643623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.643640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.643665 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.643685 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.665218 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:19 crc kubenswrapper[4751]: E0131 14:42:19.665485 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:19 crc kubenswrapper[4751]: E0131 14:42:19.665613 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:23.665584711 +0000 UTC m=+48.040297636 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.747236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.747547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.747821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.748167 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.748730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.852101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.852152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.852170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.852193 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.852212 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.955704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.955766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.955784 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.955808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:19 crc kubenswrapper[4751]: I0131 14:42:19.955825 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:19Z","lastTransitionTime":"2026-01-31T14:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.060063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.060431 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.060628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.060847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.060993 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.164328 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.164777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.165042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.165634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.166157 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.269273 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.269624 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.269870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.270141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.270302 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.374521 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.374584 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.374602 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.374633 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.374651 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.375715 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:10:57.416654753 +0000 UTC Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.477311 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.477376 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.477398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.477428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.477453 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.580188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.580238 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.580254 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.580277 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.580295 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.683868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.683925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.683941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.683964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.683982 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.787244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.787301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.787319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.787344 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.787360 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.889899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.889951 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.889969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.889993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.890010 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.992639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.992704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.992721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.992744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:20 crc kubenswrapper[4751]: I0131 14:42:20.992761 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:20Z","lastTransitionTime":"2026-01-31T14:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.098634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.098687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.098702 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.098723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.098743 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.201482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.201548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.201574 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.201601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.201623 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.305191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.305549 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.305721 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.305934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.306154 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.376019 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:21:43.91524379 +0000 UTC Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.405592 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:21 crc kubenswrapper[4751]: E0131 14:42:21.405999 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.405722 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:21 crc kubenswrapper[4751]: E0131 14:42:21.406396 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.405726 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.405664 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:21 crc kubenswrapper[4751]: E0131 14:42:21.406891 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:21 crc kubenswrapper[4751]: E0131 14:42:21.407052 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.411420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.411498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.411515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.411534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.411549 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.520215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.520573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.520747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.521002 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.521248 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.624977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.625027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.625043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.625101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.625119 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.727943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.728023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.728044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.728107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.728125 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.831153 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.831643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.831780 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.831898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.832015 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.935345 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.935664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.935810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.935946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:21 crc kubenswrapper[4751]: I0131 14:42:21.936062 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:21Z","lastTransitionTime":"2026-01-31T14:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.038817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.038865 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.038887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.038917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.038937 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.142473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.142527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.142544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.142566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.142584 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.246042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.246134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.246152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.246176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.246194 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.349423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.349483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.349505 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.349539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.349561 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.377453 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 09:31:55.38334444 +0000 UTC Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.452891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.452972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.452989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.453012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.453030 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.556421 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.556516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.556536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.556562 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.556579 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.660259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.660314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.660333 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.660355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.660372 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.762668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.762715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.762732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.762754 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.762771 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.788368 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.788441 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.788463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.788492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.788514 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: E0131 14:42:22.811634 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:22Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.817264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.817339 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.817366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.817395 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.817416 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: E0131 14:42:22.833614 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:22Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.838596 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.838664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.838687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.838717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.838737 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: E0131 14:42:22.861179 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:22Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.867198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.867316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.867361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.867397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.867421 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: E0131 14:42:22.886693 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:22Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.891312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.891528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.891556 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.891579 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.891597 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:22 crc kubenswrapper[4751]: E0131 14:42:22.907385 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:22Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:22 crc kubenswrapper[4751]: E0131 14:42:22.907605 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.910001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.910063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.910111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.910135 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:22 crc kubenswrapper[4751]: I0131 14:42:22.910154 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:22Z","lastTransitionTime":"2026-01-31T14:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.012908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.012987 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.013010 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.013043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.013065 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.116351 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.116707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.116792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.116886 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.116961 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.219562 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.219637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.219656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.219681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.219698 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.322740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.322810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.322832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.322859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.322876 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.378456 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 17:41:18.316144547 +0000 UTC Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.405151 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.405181 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.405211 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.405367 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:23 crc kubenswrapper[4751]: E0131 14:42:23.405522 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:23 crc kubenswrapper[4751]: E0131 14:42:23.405641 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:23 crc kubenswrapper[4751]: E0131 14:42:23.405774 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:23 crc kubenswrapper[4751]: E0131 14:42:23.405898 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.425299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.425357 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.425407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.425430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.425447 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.528347 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.528442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.528493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.528519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.528573 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.631469 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.631514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.631524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.631538 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.631546 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.718237 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:23 crc kubenswrapper[4751]: E0131 14:42:23.718497 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:23 crc kubenswrapper[4751]: E0131 14:42:23.718650 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:31.718588932 +0000 UTC m=+56.093301857 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.734663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.734725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.734743 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.734767 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.734785 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.837794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.837844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.837859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.837877 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.837888 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.941515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.941589 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.941609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.941633 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:23 crc kubenswrapper[4751]: I0131 14:42:23.941649 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:23Z","lastTransitionTime":"2026-01-31T14:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.045161 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.045230 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.045253 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.045281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.045302 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.148141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.148214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.148238 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.148267 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.148290 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.251528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.251579 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.251598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.251621 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.251638 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.354812 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.354894 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.354917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.354947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.354968 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.379188 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:40:19.095011238 +0000 UTC Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.458414 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.458527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.458601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.458631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.458701 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.561535 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.561590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.561607 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.561628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.561646 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.664911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.664962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.664978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.665000 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.665019 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.767736 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.767779 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.767795 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.767817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.767835 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.871296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.871367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.871388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.871416 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.871437 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.974155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.974219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.974244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.974286 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:24 crc kubenswrapper[4751]: I0131 14:42:24.974308 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:24Z","lastTransitionTime":"2026-01-31T14:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.076654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.076724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.076744 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.076769 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.076787 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.179433 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.179489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.179507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.179558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.179573 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.282435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.282497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.282514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.282537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.282566 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.379789 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:17:31.895660533 +0000 UTC Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.385359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.385412 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.385429 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.385452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.385469 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.404811 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.404859 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.404950 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:25 crc kubenswrapper[4751]: E0131 14:42:25.405016 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.405041 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:25 crc kubenswrapper[4751]: E0131 14:42:25.405204 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:25 crc kubenswrapper[4751]: E0131 14:42:25.405315 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:25 crc kubenswrapper[4751]: E0131 14:42:25.405410 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.488586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.488631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.488647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.488669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.488688 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.591395 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.591465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.591490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.591521 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.591542 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.694768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.694811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.694822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.694840 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.694853 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.797942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.798011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.798029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.798054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.798101 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.900627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.900656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.900664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.900677 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:25 crc kubenswrapper[4751]: I0131 14:42:25.900686 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:25Z","lastTransitionTime":"2026-01-31T14:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.003519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.003575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.003593 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.003682 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.003704 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.106954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.107016 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.107035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.107059 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.107103 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.209648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.209687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.209698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.209714 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.209726 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.312588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.312648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.312664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.312687 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.312704 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.380513 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 12:14:25.956807236 +0000 UTC Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.415439 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.415498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.415515 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.415539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.415556 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.426773 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.443152 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.467196 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c599de37a76f4a9f00441a0b18a38e5315e42c49b315308d22f67c4cc68a8794\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:11Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:11.894312 6073 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:11.894354 6073 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:11.894387 6073 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
14:42:11.894536 6073 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:11.895132 6073 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:11.895157 6073 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:11.895178 6073 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:11.895199 6073 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:11.895209 6073 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:11.895239 6073 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:11.895309 6073 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:42:11.895321 6073 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:11.895334 6073 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:11.895316 6073 factory.go:656] Stopping watch factory\\\\nI0131 14:42:11.895369 6073 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 
14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.490502 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.511021 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.518771 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.518834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.518852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.518876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.518894 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.531536 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.548531 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.570871 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.588537 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.611037 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.622991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.623377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.623577 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.623791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.623919 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.632684 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.657463 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
26-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.683960 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mult
us-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.706896 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.722539 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.726533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.726697 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.726820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.726941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.727044 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.739703 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:26Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:26 crc 
kubenswrapper[4751]: I0131 14:42:26.830156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.830209 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.830227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.830250 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.830268 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.933421 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.933777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.934088 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.934213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:26 crc kubenswrapper[4751]: I0131 14:42:26.934320 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:26Z","lastTransitionTime":"2026-01-31T14:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.036948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.037271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.037411 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.037769 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.038045 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.140608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.140637 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.140645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.140657 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.140666 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.242872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.242934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.242950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.242972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.242986 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.345988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.346094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.346112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.346134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.346153 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.381592 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 15:09:14.647209082 +0000 UTC Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.405512 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.405589 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.405532 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.405518 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:27 crc kubenswrapper[4751]: E0131 14:42:27.405690 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:27 crc kubenswrapper[4751]: E0131 14:42:27.405897 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:27 crc kubenswrapper[4751]: E0131 14:42:27.405972 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:27 crc kubenswrapper[4751]: E0131 14:42:27.406097 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.448466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.448493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.448502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.448516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.448525 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.551393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.551436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.551449 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.551465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.551479 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.654340 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.654409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.654430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.654459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.654480 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.757711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.757785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.757806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.757832 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.757849 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.860581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.860631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.860648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.860671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.860719 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.963734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.963785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.963801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.963826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:27 crc kubenswrapper[4751]: I0131 14:42:27.963843 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:27Z","lastTransitionTime":"2026-01-31T14:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.066732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.066811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.066834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.066865 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.066890 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.170518 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.170592 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.170611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.170635 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.170652 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.273835 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.273897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.273921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.273949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.273972 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.377381 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.377450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.377467 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.377491 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.377512 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.382149 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:16:35.597995444 +0000 UTC Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.406230 4751 scope.go:117] "RemoveContainer" containerID="bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.428410 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-aler
ter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.459566 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155f
fb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.476769 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce8
8db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.479664 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.479691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.479698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.479711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.479719 4751 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.486785 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.497937 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha2
56:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.507194 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.516882 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc 
kubenswrapper[4751]: I0131 14:42:28.529317 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.539219 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.562088 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.579519 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.582374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.582398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.582406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.582419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.582427 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.597814 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.616492 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.633329 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.648143 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.678251 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.703085 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.703126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.703138 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc 
kubenswrapper[4751]: I0131 14:42:28.703155 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.703166 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.791701 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/1.log" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808518 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808547 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808571 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808717 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.808922 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.825465 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.848235 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3
d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8
402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:0
7Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.863879 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.883778 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.898410 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.911056 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.911144 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.911152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.911165 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.911173 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:28Z","lastTransitionTime":"2026-01-31T14:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.912144 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc 
kubenswrapper[4751]: I0131 14:42:28.917115 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.924919 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.927025 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.938354 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.957917 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.973432 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:28 crc kubenswrapper[4751]: I0131 14:42:28.994623 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:28Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.013571 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.013608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.013616 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.013632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.013642 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.023929 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.072785 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.086523 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.098566 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.108010 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.116173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.116206 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.116215 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc 
kubenswrapper[4751]: I0131 14:42:29.116228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.116238 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.120303 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 
14:42:29.129338 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.145628 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.155736 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.165779 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.177300 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.188309 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.206323 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.218679 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.218824 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.218853 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.218863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.218877 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.218885 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.230708 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.241818 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.253662 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.270574 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.278976 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.279103 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.279125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.279147 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.279166 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279251 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279248 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279281 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279289 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:43:01.279277823 +0000 UTC m=+85.653990708 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279294 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279398 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279408 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279411 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279424 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279444 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:43:01.279438418 +0000 UTC m=+85.654151303 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279468 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 14:43:01.279453348 +0000 UTC m=+85.654166233 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279491 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:43:01.279485429 +0000 UTC m=+85.654198314 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.279551 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:01.2795449 +0000 UTC m=+85.654257785 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.282387 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.304318 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.315378 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.320866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.320930 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.320947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.320969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.320986 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.330620 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc 
kubenswrapper[4751]: I0131 14:42:29.383213 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 17:31:37.820269618 +0000 UTC Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.405665 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.405712 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.405767 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.405771 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.405844 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.405931 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.406131 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.406247 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.424232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.424281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.424297 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.424316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.424333 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.527284 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.527342 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.527358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.527380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.527397 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.630043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.630139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.630157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.630181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.630199 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.732617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.732681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.732699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.732725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.732743 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.815243 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/2.log" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.816363 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/1.log" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.820708 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e" exitCode=1 Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.820786 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.820890 4751 scope.go:117] "RemoveContainer" containerID="bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.822547 4751 scope.go:117] "RemoveContainer" containerID="307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e" Jan 31 14:42:29 crc kubenswrapper[4751]: E0131 14:42:29.822798 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.835858 4751 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.835896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.835914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.835937 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.835955 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.842462 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d
95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.861979 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.883425 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.906106 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.923884 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z"
Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.942327 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.942408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.942431 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.942457 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.942484 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:29Z","lastTransitionTime":"2026-01-31T14:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.945717 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.961817 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc kubenswrapper[4751]: I0131 14:42:29.978783 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:29 crc 
kubenswrapper[4751]: I0131 14:42:29.992955 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:29Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.008984 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.041426 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb2056238b11891fff35b2c74e6a809f7d369d116a25803ac32725568d9461a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 14:42:13.656951 6206 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 14:42:13.657012 6206 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 14:42:13.657046 6206 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0131 14:42:13.657054 6206 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 14:42:13.657134 6206 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 14:42:13.657145 6206 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 14:42:13.657168 6206 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 14:42:13.657186 6206 factory.go:656] Stopping watch factory\\\\nI0131 14:42:13.657200 6206 ovnkube.go:599] Stopped ovnkube\\\\nI0131 14:42:13.657232 6206 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:42:13.657247 6206 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:42:13.657256 6206 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:42:13.657266 6206 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:42:13.657275 6206 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:42:13.657286 6206 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), 
V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.050328 
4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.050376 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.050393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.050417 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.050436 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.057121 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.079128 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.096970 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.114476 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.132335 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.152491 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:30Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.153130 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.153283 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.153409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.153507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.153631 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.256740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.256789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.256806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.256830 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.256846 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.359809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.360216 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.360342 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.360496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.360634 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.384164 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 17:57:17.079998756 +0000 UTC Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.464145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.464265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.464288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.464316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.464342 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.568130 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.568208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.568232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.568264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.568290 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.672826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.672908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.672934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.672968 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.672995 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.775636 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.775681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.775693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.775709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.775722 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.826388 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/2.log" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.878273 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.878332 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.878349 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.878372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.878389 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.981651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.981711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.981728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.981752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:30 crc kubenswrapper[4751]: I0131 14:42:30.981769 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:30Z","lastTransitionTime":"2026-01-31T14:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.084651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.084711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.084728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.084752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.084770 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.187871 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.187927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.187939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.187960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.187975 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.290808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.290880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.290898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.290924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.290943 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.385325 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 00:26:51.693853654 +0000 UTC Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.393325 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.393383 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.393402 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.393426 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.393443 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.404905 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.404952 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.404960 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:31 crc kubenswrapper[4751]: E0131 14:42:31.405596 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.405014 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:31 crc kubenswrapper[4751]: E0131 14:42:31.405741 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:31 crc kubenswrapper[4751]: E0131 14:42:31.405866 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:31 crc kubenswrapper[4751]: E0131 14:42:31.406261 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.496272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.496669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.496880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.497133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.497305 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.611590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.612318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.612345 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.612377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.612399 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.716178 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.716480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.716671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.716821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.716988 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.816065 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:31 crc kubenswrapper[4751]: E0131 14:42:31.816365 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:31 crc kubenswrapper[4751]: E0131 14:42:31.816775 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:42:47.81674671 +0000 UTC m=+72.191459625 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.819550 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.819722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.819839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.819976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.820177 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.924243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.924539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.924716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.924869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:31 crc kubenswrapper[4751]: I0131 14:42:31.925003 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:31Z","lastTransitionTime":"2026-01-31T14:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.028393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.028472 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.028496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.028527 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.028550 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.131171 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.131543 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.132114 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.132516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.132823 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.236276 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.236631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.236868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.237169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.237391 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.340947 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.341305 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.341446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.341581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.341721 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.385837 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 11:22:14.223147652 +0000 UTC Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.445742 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.446100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.447495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.448001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.448338 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.551835 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.551897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.551974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.551998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.552016 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.610672 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.612322 4751 scope.go:117] "RemoveContainer" containerID="307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e" Jan 31 14:42:32 crc kubenswrapper[4751]: E0131 14:42:32.612675 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.627464 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.647226 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.656192 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.656243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.656259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.656282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.656299 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.668455 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z 
is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.692965 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.711516 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.733613 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.752161 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.759765 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.759864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.759883 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.759925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.759942 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.772652 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc 
kubenswrapper[4751]: I0131 14:42:32.794098 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.812915 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.836638 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.860582 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.862400 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.862444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.862461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc 
kubenswrapper[4751]: I0131 14:42:32.862485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.862503 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.884401 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.905368 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.919565 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.942282 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.961008 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:32Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.965256 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.965289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.965311 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.965329 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:32 crc kubenswrapper[4751]: I0131 14:42:32.965341 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:32Z","lastTransitionTime":"2026-01-31T14:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.068399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.068461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.068479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.068504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.068521 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.139642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.139699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.139741 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.139765 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.139783 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.161797 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:33Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.166929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.166978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.166995 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.167018 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.167037 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.186200 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:33Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.191030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.191108 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.191126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.191152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.191169 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.210771 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:33Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.217054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.217173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.217191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.217216 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.217233 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.264855 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:33Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.265574 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.267872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.267972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.268009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.268058 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.268120 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.370931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.370997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.371042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.371174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.371251 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.387601 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 13:32:23.826284927 +0000 UTC Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.404919 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.404991 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.404944 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.405134 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.405205 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.405348 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.405539 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:33 crc kubenswrapper[4751]: E0131 14:42:33.405681 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.474235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.474295 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.474312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.474354 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.474371 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.577407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.577466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.577484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.577508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.577525 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.681328 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.681389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.681406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.681430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.681447 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.784922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.785112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.785144 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.785173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.785193 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.887470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.887525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.887541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.887563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.887580 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.990127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.990194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.990214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.990239 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:33 crc kubenswrapper[4751]: I0131 14:42:33.990256 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:33Z","lastTransitionTime":"2026-01-31T14:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.093255 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.093318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.093356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.093383 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.093400 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.196122 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.196185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.196208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.196249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.196273 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.299728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.299792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.299831 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.299856 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.299874 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.387970 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 21:31:49.235268701 +0000 UTC Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.402557 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.402644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.402666 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.402699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.402727 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.506260 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.506334 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.506356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.506384 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.506406 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.610183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.610248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.610264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.610288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.610307 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.714247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.714313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.714331 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.714361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.714379 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.817534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.817588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.817605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.817627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.817675 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.921422 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.921473 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.921489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.921512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:34 crc kubenswrapper[4751]: I0131 14:42:34.921553 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:34Z","lastTransitionTime":"2026-01-31T14:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.024171 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.024271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.024288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.024312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.024330 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.127399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.127474 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.127497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.127526 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.127548 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.231023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.231122 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.231145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.231172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.231195 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.334523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.334572 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.334589 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.334611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.334628 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.388573 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 12:06:32.382209489 +0000 UTC Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.404852 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.404923 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.404873 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:35 crc kubenswrapper[4751]: E0131 14:42:35.405134 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:35 crc kubenswrapper[4751]: E0131 14:42:35.405237 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.405299 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:35 crc kubenswrapper[4751]: E0131 14:42:35.405483 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:35 crc kubenswrapper[4751]: E0131 14:42:35.405642 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.439961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.440020 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.440038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.440063 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.440122 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.543269 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.543336 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.543352 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.543378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.543399 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.646335 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.646385 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.646402 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.646423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.646440 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.749727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.749789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.749807 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.749829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.749850 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.854209 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.854289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.854312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.854343 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.854366 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.958262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.958325 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.958347 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.958378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:35 crc kubenswrapper[4751]: I0131 14:42:35.958401 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:35Z","lastTransitionTime":"2026-01-31T14:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.061920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.062026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.062051 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.062131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.062154 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.165581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.165644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.165661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.165683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.165702 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.269308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.269398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.269411 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.269425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.269435 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.379809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.379934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.380012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.380046 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.380135 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.389298 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 19:20:13.851452218 +0000 UTC Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.427701 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.442866 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.460131 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.482131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.482187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.482205 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.482227 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.482244 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.482781 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.497750 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.512051 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.524422 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.536147 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc 
kubenswrapper[4751]: I0131 14:42:36.551227 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.563211 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.584826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.584895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.584915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.584941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.584961 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.585049 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.600708 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.612296 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.625332 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.639838 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.651632 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.661982 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:36Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.688224 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.688536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.688729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc 
kubenswrapper[4751]: I0131 14:42:36.688924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.689387 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.792588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.792656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.792676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.792703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.792723 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.895829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.895903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.895923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.895949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.895967 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.999218 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.999688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:36 crc kubenswrapper[4751]: I0131 14:42:36.999707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:36.999732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:36.999749 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:36Z","lastTransitionTime":"2026-01-31T14:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.102620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.102818 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.102880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.102911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.102929 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.206099 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.206191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.206241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.206267 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.206284 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.310846 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.310901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.310918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.310941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.310959 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.389801 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 03:48:58.20389202 +0000 UTC Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.405501 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.405562 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:37 crc kubenswrapper[4751]: E0131 14:42:37.405675 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.405510 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:37 crc kubenswrapper[4751]: E0131 14:42:37.405824 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.405885 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:37 crc kubenswrapper[4751]: E0131 14:42:37.405965 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:37 crc kubenswrapper[4751]: E0131 14:42:37.406039 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.414821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.414889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.414907 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.414960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.414986 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.518598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.518669 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.518692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.518725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.518748 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.621597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.621659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.621681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.621709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.621730 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.723623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.723668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.723683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.723705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.723720 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.826581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.826644 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.826662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.826688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.826706 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.929426 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.929485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.929502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.929528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:37 crc kubenswrapper[4751]: I0131 14:42:37.929547 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:37Z","lastTransitionTime":"2026-01-31T14:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.032646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.032698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.032715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.032739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.032757 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.136128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.136214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.136232 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.136258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.136276 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.239225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.239282 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.239301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.239325 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.239342 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.341626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.341695 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.341715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.341740 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.341756 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.390431 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 13:49:02.735074908 +0000 UTC Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.444252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.444315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.444332 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.444354 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.444374 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.546976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.547048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.547098 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.547124 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.547141 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.649950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.650001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.650017 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.650039 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.650055 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.753679 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.753756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.753782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.753813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.753837 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.856401 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.856458 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.856475 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.856498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.856516 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.959659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.959731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.959752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.959778 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:38 crc kubenswrapper[4751]: I0131 14:42:38.959796 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:38Z","lastTransitionTime":"2026-01-31T14:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.062647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.062703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.062719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.062748 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.062768 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.165071 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.165148 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.165165 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.165187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.165204 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.268111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.268169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.268186 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.268212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.268229 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.371358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.371416 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.371432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.371456 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.371473 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.390909 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 20:02:01.20547735 +0000 UTC Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.404773 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:39 crc kubenswrapper[4751]: E0131 14:42:39.404882 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.405060 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:39 crc kubenswrapper[4751]: E0131 14:42:39.405157 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.405313 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:39 crc kubenswrapper[4751]: E0131 14:42:39.405385 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.405711 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:39 crc kubenswrapper[4751]: E0131 14:42:39.405782 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.474021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.474105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.474119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.474136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.474149 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.576934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.576965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.576977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.576993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.577004 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.680264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.680299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.680310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.680328 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.680340 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.783095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.783180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.783198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.783220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.783239 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.885553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.885598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.885614 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.885635 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.885654 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.988295 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.988358 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.988374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.988396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:39 crc kubenswrapper[4751]: I0131 14:42:39.988415 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:39Z","lastTransitionTime":"2026-01-31T14:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.091132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.091201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.091217 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.091240 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.091256 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.193920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.193958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.193968 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.193981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.193989 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.296436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.296493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.296512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.296534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.296551 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.391933 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:37:34.192427962 +0000 UTC Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.399377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.399439 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.399457 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.399483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.399500 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.502444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.502480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.502487 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.502500 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.502509 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.606524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.606580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.606597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.606620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.606638 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.709129 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.709174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.709185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.709202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.709215 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.811821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.811862 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.811873 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.811890 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.811900 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.914548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.914620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.914642 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.914673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:40 crc kubenswrapper[4751]: I0131 14:42:40.914696 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:40Z","lastTransitionTime":"2026-01-31T14:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.017520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.017623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.017640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.017660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.017674 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.120852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.120903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.120918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.120941 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.120955 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.222865 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.222915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.222945 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.222961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.222970 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.324961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.325017 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.325031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.325050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.325064 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.393011 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 23:11:46.210802164 +0000 UTC
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.405305 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.405370 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.405372 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 14:42:41 crc kubenswrapper[4751]: E0131 14:42:41.405440 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.405326 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:42:41 crc kubenswrapper[4751]: E0131 14:42:41.405625 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 14:42:41 crc kubenswrapper[4751]: E0131 14:42:41.405638 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504"
Jan 31 14:42:41 crc kubenswrapper[4751]: E0131 14:42:41.405677 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.427689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.427773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.427798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.427829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.427851 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.530436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.530485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.530495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.530509 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.530517 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.633974 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.634033 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.634045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.634061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.634105 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.737863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.737925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.737935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.737958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.737969 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.867861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.867935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.867954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.867983 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.868003 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.970806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.971055 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.971112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.971126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:41 crc kubenswrapper[4751]: I0131 14:42:41.971136 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:41Z","lastTransitionTime":"2026-01-31T14:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.074557 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.074626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.074645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.074670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.074689 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.177844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.177888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.177896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.177912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.177922 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.280542 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.280601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.280613 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.280632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.280645 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.383028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.383096 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.383113 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.383136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.383152 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.393395 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 16:49:51.274816437 +0000 UTC
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.485247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.485288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.485301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.485318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.485331 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.587405 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.587444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.587453 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.587470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.587484 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.690281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.690342 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.690361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.690385 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.690404 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.792816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.792861 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.792873 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.792890 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.792902 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.894845 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.894908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.894926 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.895164 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.895188 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.996890 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.996946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.996959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.996979 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:42 crc kubenswrapper[4751]: I0131 14:42:42.996992 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:42Z","lastTransitionTime":"2026-01-31T14:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.099124 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.099174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.099184 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.099201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.099215 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.202139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.202170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.202181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.202195 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.202206 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.305504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.305553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.305562 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.305577 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.305587 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.394529 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 22:19:40.746100594 +0000 UTC
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.405979 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.406105 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.406135 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.406220 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.406355 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504"
Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.406593 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.406687 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.406798 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.407594 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.407627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.407638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.407652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.407664 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.510723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.510792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.510810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.510835 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.510857 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.613379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.613629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.613759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.613859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.613939 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.661442 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.661728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.661805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.661936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.662025 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.676619 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:43Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.681332 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.681626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.681839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.681957 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.682096 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.696262 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:43Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.701027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.701094 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.701104 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.701119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.701129 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.714129 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:43Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.718827 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.718874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.718886 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.718906 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.718917 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.735009 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:43Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.739753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.739801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.739811 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.739829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.739845 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.757918 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:43Z is after 2025-08-24T17:21:41Z"
Jan 31 14:42:43 crc kubenswrapper[4751]: E0131 14:42:43.758111 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.760541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.760590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.760600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.760615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.760632 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.863543 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.863587 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.863598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.863613 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.863627 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.966623 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.966691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.966714 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.966747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:43 crc kubenswrapper[4751]: I0131 14:42:43.966769 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:43Z","lastTransitionTime":"2026-01-31T14:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.069351 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.069388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.069399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.069416 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.069426 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.174185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.174231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.174243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.174260 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.174271 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.277961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.278029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.278048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.278102 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.278122 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.380266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.380492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.380632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.380753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.380838 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.395018 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 01:45:32.087808831 +0000 UTC
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.483808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.484152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.484245 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.484325 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.484397 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.586645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.586902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.587000 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.587142 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.587229 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.690294 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.690432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.690460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.690498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.690524 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.794643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.794822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.794896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.794988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.795067 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.899220 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.899569 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.899591 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.899611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:44 crc kubenswrapper[4751]: I0131 14:42:44.899758 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:44Z","lastTransitionTime":"2026-01-31T14:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.004649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.004694 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.004703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.004720 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.004729 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.108593 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.108650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.108663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.108684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.108697 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.212914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.212985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.213007 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.213029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.213044 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.316484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.316548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.316573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.316629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.316652 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.396593 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 16:02:26.520514282 +0000 UTC
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.405029 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.405148 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.405148 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.405301 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 14:42:45 crc kubenswrapper[4751]: E0131 14:42:45.405554 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504"
Jan 31 14:42:45 crc kubenswrapper[4751]: E0131 14:42:45.405704 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 14:42:45 crc kubenswrapper[4751]: E0131 14:42:45.405830 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 14:42:45 crc kubenswrapper[4751]: E0131 14:42:45.405955 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.420460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.420524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.420539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.420563 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.420579 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.524333 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.524398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.524412 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.524434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.524448 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.626937 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.626991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.627004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.627027 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.627040 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.729691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.729751 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.729770 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.729794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.729847 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.832992 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.833057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.833111 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.833136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.833154 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.936388 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.936438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.936452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.936472 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:45 crc kubenswrapper[4751]: I0131 14:42:45.936486 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:45Z","lastTransitionTime":"2026-01-31T14:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.039504 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.039568 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.039582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.039601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.039613 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.142197 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.142252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.142263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.142281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.142292 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.244923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.244984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.244994 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.245010 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.245018 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.347126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.347170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.347182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.347199 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.347212 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.396787 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 07:21:47.703379668 +0000 UTC Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.406687 4751 scope.go:117] "RemoveContainer" containerID="307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e" Jan 31 14:42:46 crc kubenswrapper[4751]: E0131 14:42:46.406868 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.424758 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.441513 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.449613 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.449776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.449858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:46 crc 
kubenswrapper[4751]: I0131 14:42:46.449943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.450041 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.460431 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14
:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 
14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.477287 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-3
1T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.497396 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.515613 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.533401 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.548614 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.552928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.552962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.552973 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.552989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.553000 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.563927 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z 
is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.582699 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.595850 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.609734 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.621862 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.632642 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc 
kubenswrapper[4751]: I0131 14:42:46.645474 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.654985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.655030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.655045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.655091 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 
14:42:46.655108 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.655345 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.672280 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:46Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.757451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.757485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.757494 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.757510 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.757519 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.859867 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.859904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.859913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.859927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.859937 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.962820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.962866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.962877 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.962895 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:46 crc kubenswrapper[4751]: I0131 14:42:46.962905 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:46Z","lastTransitionTime":"2026-01-31T14:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.065319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.065560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.065573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.065593 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.065605 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.167971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.168280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.168340 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.168396 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.168461 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.270702 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.270959 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.271041 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.271146 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.271215 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.373879 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.373938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.373956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.373982 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.374002 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.397486 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 18:46:06.237489149 +0000 UTC Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.405907 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:47 crc kubenswrapper[4751]: E0131 14:42:47.406139 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.406176 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.406240 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.406188 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:47 crc kubenswrapper[4751]: E0131 14:42:47.406317 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:47 crc kubenswrapper[4751]: E0131 14:42:47.406695 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:47 crc kubenswrapper[4751]: E0131 14:42:47.406551 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.476821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.477136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.477209 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.477279 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.477359 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.580199 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.580248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.580265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.580292 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.580310 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.682843 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.682888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.682901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.682918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.682931 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.785100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.785491 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.785574 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.785661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.785729 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.888629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.888682 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.888696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.888743 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.888760 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.895806 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:47 crc kubenswrapper[4751]: E0131 14:42:47.895950 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:47 crc kubenswrapper[4751]: E0131 14:42:47.896014 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:43:19.895991593 +0000 UTC m=+104.270704488 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.993555 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.993659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.993680 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.993709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:47 crc kubenswrapper[4751]: I0131 14:42:47.993732 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:47Z","lastTransitionTime":"2026-01-31T14:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.097154 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.097193 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.097203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.097219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.097229 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.201264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.201323 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.201342 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.201366 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.201384 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.303805 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.303854 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.303869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.303891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.303908 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.398684 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 19:40:11.34768546 +0000 UTC Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.406513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.406555 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.406567 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.406586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.406598 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.508221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.508496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.508628 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.508723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.508817 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.610452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.610506 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.610522 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.610545 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.610564 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.712377 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.712414 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.712425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.712441 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.712452 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.814892 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.815157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.815235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.815301 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.815366 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.918312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.918355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.918371 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.918392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:48 crc kubenswrapper[4751]: I0131 14:42:48.918412 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:48Z","lastTransitionTime":"2026-01-31T14:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.021243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.021319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.021339 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.021369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.021395 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.123828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.125788 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.126008 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.126208 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.126624 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.230214 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.230248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.230257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.230271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.230280 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.333107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.333136 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.333146 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.333160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.333169 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.399716 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 02:03:32.583360852 +0000 UTC Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.405193 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.405229 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.405340 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.405337 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:49 crc kubenswrapper[4751]: E0131 14:42:49.405471 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:49 crc kubenswrapper[4751]: E0131 14:42:49.405545 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:49 crc kubenswrapper[4751]: E0131 14:42:49.405626 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:49 crc kubenswrapper[4751]: E0131 14:42:49.405765 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.435419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.435471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.435482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.435508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.435518 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.538350 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.538387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.538397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.538412 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.538421 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.642821 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.642848 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.642856 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.642869 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.642878 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.745040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.745086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.745095 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.745106 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.745116 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.847908 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.847975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.847997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.848028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.848052 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.898161 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/0.log" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.898213 4751 generic.go:334] "Generic (PLEG): container finished" podID="e7dd989b-33df-4562-a60b-f273428fea3d" containerID="7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608" exitCode=1 Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.898245 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerDied","Data":"7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.898644 4751 scope.go:117] "RemoveContainer" containerID="7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.925252 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:49Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.939246 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:49Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.950451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.950486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.950495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.950510 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.950521 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:49Z","lastTransitionTime":"2026-01-31T14:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.971608 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:49Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:49 crc kubenswrapper[4751]: I0131 14:42:49.983857 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:49Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.000297 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:49Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.014924 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.032355 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.045098 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.053834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.053936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.053997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.054090 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.054181 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.059718 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.073852 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.084814 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.102562 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.123527 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.135936 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.149365 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.157876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.157922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.157940 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.157966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.157983 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.161305 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.174288 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.261423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.261470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.261479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.261494 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.261505 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.364198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.364266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.364289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.364319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.364341 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.400004 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:47:35.620530663 +0000 UTC Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.467496 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.467539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.467548 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.467568 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.467581 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.570169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.570528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.570618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.570699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.570788 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.673611 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.673912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.674090 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.674233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.674353 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.782197 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.782239 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.782249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.782266 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.782277 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.885329 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.885634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.885808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.885952 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.886135 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.905994 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/0.log" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.906280 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerStarted","Data":"2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.928320 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.945293 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.958230 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.974629 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.989318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.989360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.989376 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.989399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.989416 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:50Z","lastTransitionTime":"2026-01-31T14:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:50 crc kubenswrapper[4751]: I0131 14:42:50.995612 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:50Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.012247 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.025501 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.038690 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc 
kubenswrapper[4751]: I0131 14:42:51.057708 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.070668 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.091617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.091681 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.091698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.091722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.091740 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.093959 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.108426 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.123722 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.137803 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.153988 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.172107 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.186147 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:51Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.193783 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.193839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.193858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.193882 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.193900 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.296466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.296517 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.296533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.296554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.296571 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.399471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.399518 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.399531 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.399553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.399570 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.400828 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:05:42.661210316 +0000 UTC Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.405208 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.405274 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.405230 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.405222 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:51 crc kubenswrapper[4751]: E0131 14:42:51.405367 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:51 crc kubenswrapper[4751]: E0131 14:42:51.405520 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:51 crc kubenswrapper[4751]: E0131 14:42:51.405620 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:51 crc kubenswrapper[4751]: E0131 14:42:51.405691 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.502419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.502468 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.502480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.502499 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.502513 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.604419 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.604450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.604459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.604476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.604487 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.706949 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.706981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.706989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.707003 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.707012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.810105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.810141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.810151 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.810163 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.810172 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.912706 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.912779 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.912802 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.912830 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:51 crc kubenswrapper[4751]: I0131 14:42:51.912850 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:51Z","lastTransitionTime":"2026-01-31T14:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.015444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.015494 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.015512 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.015536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.015555 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.118975 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.119015 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.119026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.119040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.119051 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.221354 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.221428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.221451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.221479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.221503 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.324261 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.324319 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.324337 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.324364 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.324384 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.401554 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:44:02.046084211 +0000 UTC Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.426841 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.426889 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.426899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.426913 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.426922 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.529993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.530403 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.530452 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.530475 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.530491 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.633695 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.633735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.633743 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.633759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.633774 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.737044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.737133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.737150 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.737172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.737189 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.839586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.839911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.839923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.839935 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.839946 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.941978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.942054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.942115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.942149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:52 crc kubenswrapper[4751]: I0131 14:42:52.942177 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:52Z","lastTransitionTime":"2026-01-31T14:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.045268 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.045321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.045337 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.045361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.045383 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.147884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.147943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.147965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.147989 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.148006 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.251195 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.251252 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.251270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.251300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.251318 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.353446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.353520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.353536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.353565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.353589 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.402498 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 12:54:59.610272862 +0000 UTC Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.405787 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.405907 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.406041 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.406119 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.406156 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.406281 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.406313 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.406455 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.456582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.456634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.456646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.456663 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.456675 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.559042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.559129 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.559151 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.559180 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.559201 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.661241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.661287 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.661299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.661315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.661326 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.764490 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.764587 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.764609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.764639 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.764659 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.866602 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.866649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.866660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.866677 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.866690 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.915041 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.915100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.915110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.915126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.915136 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.934133 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:53Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.939670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.939736 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.939748 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.939768 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.939802 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.960030 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:53Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.965986 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.966110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.966130 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.966188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.966205 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:53 crc kubenswrapper[4751]: E0131 14:42:53.987913 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:53Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.994654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.994708 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.994719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.994739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:53 crc kubenswrapper[4751]: I0131 14:42:53.994752 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:53Z","lastTransitionTime":"2026-01-31T14:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: E0131 14:42:54.010146 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:54Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.014724 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.014841 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.014903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.014929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.014946 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: E0131 14:42:54.030013 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:54Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:54 crc kubenswrapper[4751]: E0131 14:42:54.030213 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.032703 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.032742 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.032771 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.032789 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.032798 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.135567 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.135660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.135678 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.135737 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.135756 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.239714 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.239773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.239791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.239814 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.239831 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.343647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.343712 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.343764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.343794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.343818 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.402686 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:11:33.052218152 +0000 UTC Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.446181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.446265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.446313 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.446335 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.446352 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.548791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.548884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.548894 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.548929 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.548942 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.651619 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.651659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.651674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.651693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.651708 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.754984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.755052 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.755112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.755677 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.755738 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.858671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.858709 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.858717 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.858731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.858742 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.962116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.962189 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.962210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.962235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:54 crc kubenswrapper[4751]: I0131 14:42:54.962252 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:54Z","lastTransitionTime":"2026-01-31T14:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.065033 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.065131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.065148 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.065172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.065190 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.167876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.167925 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.167943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.167999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.168016 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.270992 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.271363 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.271520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.271656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.271782 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.374540 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.374606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.374627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.374649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.374667 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.403227 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 15:51:35.202500685 +0000 UTC
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.405682 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.405716 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.405761 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.405705 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 14:42:55 crc kubenswrapper[4751]: E0131 14:42:55.405916 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504"
Jan 31 14:42:55 crc kubenswrapper[4751]: E0131 14:42:55.406318 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 14:42:55 crc kubenswrapper[4751]: E0131 14:42:55.406484 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 14:42:55 crc kubenswrapper[4751]: E0131 14:42:55.406570 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.477347 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.477429 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.477447 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.477471 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.477488 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.579876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.579942 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.579964 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.579991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.580012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.681965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.682029 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.682050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.682112 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.682136 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.785126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.785175 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.785187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.785204 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.785217 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.887857 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.887899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.887911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.887927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.887939 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.990609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.990674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.990690 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.990716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:55 crc kubenswrapper[4751]: I0131 14:42:55.990734 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:55Z","lastTransitionTime":"2026-01-31T14:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.093145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.093181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.093193 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.093207 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.093225 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.195872 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.195922 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.195939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.195961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.195977 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.298880 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.298945 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.298962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.298986 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.299005 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.402128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.402191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.402213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.402239 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.402257 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.403572 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 00:42:19.648682466 +0000 UTC Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.420428 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.435469 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.457058 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.473139 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.492873 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.504420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.504497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.504523 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.504558 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.504576 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.512558 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.532029 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.555191 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c
893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.571996 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.587478 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.603878 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.608364 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.608425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.608438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.608454 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.608465 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.619471 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc 
kubenswrapper[4751]: I0131 14:42:56.638121 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.651495 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.683852 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.705932 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.710707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.710818 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.710842 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.710874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.710896 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.722414 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:56Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.814040 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.814120 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.814140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.814165 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.814183 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.917531 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.917874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.918011 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.918321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:56 crc kubenswrapper[4751]: I0131 14:42:56.918473 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:56Z","lastTransitionTime":"2026-01-31T14:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.022318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.023299 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.023485 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.023655 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.023777 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.127275 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.127331 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.127342 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.127361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.127372 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.230544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.230601 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.230615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.230638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.230655 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.334133 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.334175 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.334188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.334210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.334227 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.404516 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 18:01:54.246493654 +0000 UTC Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.404832 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.404865 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.404881 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.404881 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:57 crc kubenswrapper[4751]: E0131 14:42:57.404996 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:57 crc kubenswrapper[4751]: E0131 14:42:57.405168 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:57 crc kubenswrapper[4751]: E0131 14:42:57.405295 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:57 crc kubenswrapper[4751]: E0131 14:42:57.405420 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.436960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.437043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.437056 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.437097 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.437114 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.540407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.540739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.540839 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.540924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.540999 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.643761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.643816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.643826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.643843 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.643854 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.746462 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.746520 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.746537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.746560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.746577 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.849415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.849658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.849778 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.849862 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.850144 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.952395 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.952743 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.952826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.952917 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:57 crc kubenswrapper[4751]: I0131 14:42:57.952997 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:57Z","lastTransitionTime":"2026-01-31T14:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.055038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.055339 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.055408 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.055489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.055559 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.157609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.157659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.157671 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.157689 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.157702 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.263240 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.263324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.263355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.263409 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.263435 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.366394 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.366443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.366460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.366484 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.366500 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.405537 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 01:26:27.520264792 +0000 UTC Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.469685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.469767 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.469791 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.469816 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.469835 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.573170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.573552 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.573704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.573845 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.573962 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.676646 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.676971 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.677146 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.677280 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.677393 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.780674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.780722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.780735 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.780752 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.780765 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.884019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.884132 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.884157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.884188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.884210 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.987022 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.987086 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.987100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.987118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:58 crc kubenswrapper[4751]: I0131 14:42:58.987131 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:58Z","lastTransitionTime":"2026-01-31T14:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.089322 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.089356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.089367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.089382 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.089392 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.192498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.192539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.192551 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.192566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.192577 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.295051 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.295116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.295127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.295145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.295157 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.398411 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.398453 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.398463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.398477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.398487 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.404779 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.404863 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.404985 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:42:59 crc kubenswrapper[4751]: E0131 14:42:59.404871 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:42:59 crc kubenswrapper[4751]: E0131 14:42:59.405116 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.404938 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:42:59 crc kubenswrapper[4751]: E0131 14:42:59.405295 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:42:59 crc kubenswrapper[4751]: E0131 14:42:59.405397 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.406222 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 20:48:41.556041555 +0000 UTC Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.406836 4751 scope.go:117] "RemoveContainer" containerID="307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.501480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.501519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.501531 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.501550 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.501562 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.604621 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.605128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.605147 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.605170 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.605218 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.709235 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.709300 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.709318 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.709344 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.709362 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.812248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.812413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.812437 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.812697 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.812744 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.915654 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.915702 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.915715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.915734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.915749 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:42:59Z","lastTransitionTime":"2026-01-31T14:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.938865 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/2.log" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.942225 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac"} Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.942769 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.959542 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:59Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.976309 4751 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:59Z is after 2025-08-24T17:21:41Z" Jan 31 14:42:59 crc kubenswrapper[4751]: I0131 14:42:59.994267 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:0
0 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"n
ame\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:42:59Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.013779 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73f
b39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.018373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.018398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.018407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.018422 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.018434 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.030848 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.047619 4751 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.058458 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.070029 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc 
kubenswrapper[4751]: I0131 14:43:00.089468 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.108858 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.121314 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.121369 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.121379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.121399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.121411 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.141242 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.157388 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.171711 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.191012 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.211041 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.224728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.224762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.224771 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.224787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.224797 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.227971 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.246507 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.328054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.328292 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.328353 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc 
kubenswrapper[4751]: I0131 14:43:00.328450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.328507 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.406789 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 23:13:41.196467546 +0000 UTC Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.420036 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.431306 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.431350 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.431367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.431387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.431405 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.534019 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.534430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.534561 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.534684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.534803 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.639346 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.639398 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.639413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.639436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.639501 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.742625 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.742705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.742729 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.742759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.742839 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.846213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.846461 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.846715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.846943 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.847160 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.949159 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.950131 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.950447 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.950037 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/3.log" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.950742 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.950988 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:00Z","lastTransitionTime":"2026-01-31T14:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.952060 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/2.log" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.955525 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" exitCode=1 Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.956369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac"} Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.956473 4751 scope.go:117] "RemoveContainer" containerID="307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.957354 4751 scope.go:117] "RemoveContainer" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" Jan 31 14:43:00 crc kubenswrapper[4751]: E0131 14:43:00.957617 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:43:00 crc kubenswrapper[4751]: I0131 14:43:00.974954 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:00Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.006223 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307462c29efce1de33c4d50cdbc74a6cb86d00b128964370024b108640b05b5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:29Z\\\",\\\"message\\\":\\\"t:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 14:42:29.388826 6425 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9979, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0131 14:42:29.388860 6425 services_controller.go:444] Built service openshift-etcd/etcd LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0131 14:42:29.388874 6425 services_controller.go:445] Built service openshift-etcd/etcd LB template configs for network=default: []services.lbConfig(nil)\\\\nF0131 14:42:29.388002 6425 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:43:00Z\\\",\\\"message\\\":\\\"t handler 5\\\\nI0131 14:43:00.256116 6857 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256190 6857 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256429 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:43:00.256447 6857 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:43:00.256455 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:43:00.256450 6857 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 14:43:00.256470 6857 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:43:00.256481 6857 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:43:00.256463 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:43:00.256712 6857 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-
log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 
2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.026201 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.046974 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.053417 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.053457 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.053475 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.053498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.053516 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.067792 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2
604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.088361 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.106420 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.124894 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.141909 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.156641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.156679 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.156691 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc 
kubenswrapper[4751]: I0131 14:43:01.156712 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.156729 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.157364 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af117b1f-6308-4303-bff0-ebc3a310c356\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e9f62b49c0d916da6e1631f3216d52fd37ab407e878dc0509ccb19d0e5fb1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.174768 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.198034 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:/
/7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/
lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.223337 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73f
b39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.242990 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.264812 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.265051 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.265186 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.265276 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.265351 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.267275 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d
95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.281629 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.296463 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc 
kubenswrapper[4751]: I0131 14:43:01.315697 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.338590 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.338766 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.338737758 +0000 UTC m=+149.713450653 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.338850 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.338889 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.338931 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.338969 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339135 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339177 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339198 4751 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339234 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339244 4751 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339260 4751 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339436 4751 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for 
pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339272 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.339246101 +0000 UTC m=+149.713959006 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339514 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.339497707 +0000 UTC m=+149.714210592 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339526 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-31 14:44:05.339520578 +0000 UTC m=+149.714233453 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339150 4751 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.339747 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.339689602 +0000 UTC m=+149.714402517 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.370583 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.370673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.370693 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.370723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.370749 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.405273 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.405432 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.405508 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.405560 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.405526 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.405696 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.405819 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.405868 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.408279 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 23:54:06.89267671 +0000 UTC Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.473736 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.473787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.473799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.473817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.473828 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.576996 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.577084 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.577100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.577124 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.577141 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.679828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.679915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.679933 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.679962 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.679980 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.783577 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.783911 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.784062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.784236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.784382 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.887806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.887850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.887860 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.887878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.887889 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.961591 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/3.log" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.967548 4751 scope.go:117] "RemoveContainer" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" Jan 31 14:43:01 crc kubenswrapper[4751]: E0131 14:43:01.967833 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.992809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.992859 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.992870 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.992888 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.992900 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:01Z","lastTransitionTime":"2026-01-31T14:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:01 crc kubenswrapper[4751]: I0131 14:43:01.992803 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:01Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.006143 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.020748 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc 
kubenswrapper[4751]: I0131 14:43:02.035512 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.051582 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.095443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.095483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.095492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.095507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.095517 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.108472 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:43:00Z\\\",\\\"message\\\":\\\"t handler 5\\\\nI0131 14:43:00.256116 6857 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256190 6857 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256429 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:43:00.256447 6857 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:43:00.256455 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:43:00.256450 6857 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 14:43:00.256470 6857 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:43:00.256481 6857 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:43:00.256463 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:43:00.256712 6857 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.121820 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.140772 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.156664 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.176147 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.192559 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.198734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.198775 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.198787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc 
kubenswrapper[4751]: I0131 14:43:02.198844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.198859 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.214478 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af117b1f-6308-4303-bff0-ebc3a310c356\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e9f62b49c0d916da6e1631f3216d52fd37ab407e878dc0509ccb19d0e5fb1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.233030 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 
14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.257455 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73f
b39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.273776 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.292747 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.302244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.302323 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.302349 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.302380 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.302469 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.310362 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.330504 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:02Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.405426 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.405507 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.405519 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.405535 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.405547 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.409594 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 13:05:18.724521635 +0000 UTC Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.426615 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.509277 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.509360 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.509387 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.509418 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.509442 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.612389 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.612450 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.612468 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.612492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.612509 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.714844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.714903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.714915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.714934 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.714947 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.817961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.818001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.818012 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.818097 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.818112 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.920541 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.920592 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.920609 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.920634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:02 crc kubenswrapper[4751]: I0131 14:43:02.920652 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:02Z","lastTransitionTime":"2026-01-31T14:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.024174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.024221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.024236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.024255 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.024266 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.127849 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.127893 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.127907 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.127927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.127941 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.230906 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.230950 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.230961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.230977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.230989 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.333115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.333149 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.333159 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.333174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.333184 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.405011 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.405050 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.405058 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.405031 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:03 crc kubenswrapper[4751]: E0131 14:43:03.405201 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:03 crc kubenswrapper[4751]: E0131 14:43:03.405301 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:03 crc kubenswrapper[4751]: E0131 14:43:03.405415 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:03 crc kubenswrapper[4751]: E0131 14:43:03.405495 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.409738 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:32:30.049192764 +0000 UTC Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.436528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.436566 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.436575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.436592 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.436604 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.539909 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.539961 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.539977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.539999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.540016 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.644296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.644355 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.644375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.644397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.644414 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.747415 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.747466 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.747486 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.747510 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.747529 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.850296 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.850386 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.850403 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.850429 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.850447 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.953145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.953194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.953211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.953236 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:03 crc kubenswrapper[4751]: I0131 14:43:03.953253 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:03Z","lastTransitionTime":"2026-01-31T14:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.056379 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.056440 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.056458 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.056482 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.056501 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.083446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.083532 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.083551 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.083606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.083622 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: E0131 14:43:04.105135 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.111107 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.111164 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.111174 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.111189 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.111226 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: E0131 14:43:04.129645 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.134782 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.134858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.134881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.134915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.134942 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: E0131 14:43:04.155755 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.160591 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.160632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.160643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.160661 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.160671 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: E0131 14:43:04.178965 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.184226 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.184285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.184303 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.184328 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.184348 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: E0131 14:43:04.203877 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:04Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:04 crc kubenswrapper[4751]: E0131 14:43:04.204182 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.206630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.206705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.206718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.206742 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.206758 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.310528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.310574 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.310590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.310615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.310634 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.410661 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 08:40:46.994417485 +0000 UTC Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.415539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.415612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.415629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.415653 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.415670 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.519622 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.519700 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.519726 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.519756 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.519776 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.623576 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.623652 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.623676 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.623706 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.623729 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.727723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.727788 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.727806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.727834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.727850 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.831581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.831673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.831692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.831719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.831739 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.935970 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.936023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.936035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.936057 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:04 crc kubenswrapper[4751]: I0131 14:43:04.936087 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:04Z","lastTransitionTime":"2026-01-31T14:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.039156 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.039218 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.039241 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.039270 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.039291 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.142921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.142988 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.143009 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.143038 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.143058 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.246316 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.246374 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.246392 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.246413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.246431 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.349533 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.349607 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.349632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.349659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.349724 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.405800 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.405828 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.405908 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.405908 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:05 crc kubenswrapper[4751]: E0131 14:43:05.406114 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:05 crc kubenswrapper[4751]: E0131 14:43:05.406528 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:05 crc kubenswrapper[4751]: E0131 14:43:05.406624 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:05 crc kubenswrapper[4751]: E0131 14:43:05.406768 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.411817 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 04:49:47.597623665 +0000 UTC Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.452773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.452837 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.452857 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.452885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.452904 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.557123 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.557181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.557192 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.557213 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.557225 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.660776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.660834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.660852 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.660878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.660898 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.764181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.764247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.764265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.764290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.764308 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.866427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.866497 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.866514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.866544 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.866563 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.968699 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.968764 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.968784 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.968813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:05 crc kubenswrapper[4751]: I0131 14:43:05.968832 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:05Z","lastTransitionTime":"2026-01-31T14:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.071993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.072059 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.072145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.072172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.072189 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.176110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.176173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.176187 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.176210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.176227 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.279792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.279881 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.279899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.279923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.279970 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.383207 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.383265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.383281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.383304 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.383325 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.411993 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:41:31.866259288 +0000 UTC Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.426792 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":
\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\"
:\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.447464 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73f
b39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.461486 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.478790 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.486258 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.486324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.486343 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.486390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.486410 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.493219 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.510031 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc 
kubenswrapper[4751]: I0131 14:43:06.533288 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.543369 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.565400 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:43:00Z\\\",\\\"message\\\":\\\"t handler 5\\\\nI0131 14:43:00.256116 6857 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256190 6857 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256429 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:43:00.256447 6857 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:43:00.256455 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:43:00.256450 6857 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 14:43:00.256470 6857 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:43:00.256481 6857 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:43:00.256463 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:43:00.256712 6857 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.580558 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.589640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.589685 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.589702 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.589727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.589744 4751 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.593734 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.612873 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e00
5ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.625575 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.639553 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.656063 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.670446 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.682212 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.695210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.695273 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.696271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc 
kubenswrapper[4751]: I0131 14:43:06.696310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.696331 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.696664 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af117b1f-6308-4303-bff0-ebc3a310c356\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e9f62b49c0d916da6e1631f3216d52fd37ab407e878dc0509ccb19d0e5fb1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.728168 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23027958-cbc9-4206-8dd5-13f10df7f298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4a4eb52c2c850f91c212fdc556452ab8cc91168ddb67c2078b806d8725be2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66ea760a35f4e073d5ead7b0270164010b4dd14737b23202f83a10290f75d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa739a6a66bd2196c9131cf929bdb8a133e3e40c3dfa9a105bb3ea33fa2ede20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d196e489f72bd3c04ada6d0ea993f0ad89eb42497efc8723720ca3a7720509\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0b0fe57d51f2684ba60b1818c1e3010e5364c6d196433972b46cb3c3f9b5e61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccd9efb7096722c8a48318444b235a1970fbec711faf7448d47696ff84da5d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccd9efb7096722c8a48318444b235a1970fbec711faf7448d47696ff84da5d37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1020dca4733e38925646f97eb80524c4060630e33323e9a5a0fdc4634c6b468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1020dca4733e38925646f97eb80524c4060630e33323e9a5a0fdc4634c6b468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf0f78147bc50d98a5ba239c2456467778fb4724433d914b9ee4300ce3af6e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf0f78147bc50d98a5ba239c2456467778fb4724433d914b9ee4300ce3af6e4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:39Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:06Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.800226 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.800278 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.800295 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.800322 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.800342 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.904211 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.904281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.904302 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.904332 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:06 crc kubenswrapper[4751]: I0131 14:43:06.904354 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:06Z","lastTransitionTime":"2026-01-31T14:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.006427 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.006916 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.007212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.007435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.007653 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.111244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.111308 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.111329 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.111356 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.111375 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.214924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.214969 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.214981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.214999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.215012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.317423 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.317476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.317492 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.317513 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.317529 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.405804 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.405910 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.405921 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:07 crc kubenswrapper[4751]: E0131 14:43:07.406538 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:07 crc kubenswrapper[4751]: E0131 14:43:07.406710 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.406021 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:07 crc kubenswrapper[4751]: E0131 14:43:07.406918 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:07 crc kubenswrapper[4751]: E0131 14:43:07.406910 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.412837 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 04:48:19.34428897 +0000 UTC Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.421004 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.421102 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.421127 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.421159 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.421185 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.524924 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.525007 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.525025 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.525050 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.525113 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.627672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.627707 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.627716 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.627730 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.627740 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.731641 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.731704 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.731725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.731753 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.731771 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.835055 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.835145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.835163 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.835188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.835205 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.938139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.938191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.938203 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.938221 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:07 crc kubenswrapper[4751]: I0131 14:43:07.938233 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:07Z","lastTransitionTime":"2026-01-31T14:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.040809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.040877 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.040896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.040920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.040938 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.143604 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.143635 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.143643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.143656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.143667 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.245688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.245762 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.245788 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.245814 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.245831 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.347936 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.347990 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.348008 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.348032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.348048 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.413763 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 23:02:36.237980655 +0000 UTC Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.449991 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.450016 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.450023 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.450035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.450043 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.553116 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.553191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.553212 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.553248 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.553270 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.656306 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.656350 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.656361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.656378 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.656390 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.758606 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.758674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.758692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.758718 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.758736 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.860787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.860826 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.860834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.860847 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.860856 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.963648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.963727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.963747 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.963770 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:08 crc kubenswrapper[4751]: I0131 14:43:08.963787 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:08Z","lastTransitionTime":"2026-01-31T14:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.066326 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.066364 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.066375 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.066390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.066402 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.168898 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.168937 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.168948 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.168995 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.169012 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.272532 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.272581 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.272597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.272619 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.272635 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.376017 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.376059 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.376091 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.376110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.376122 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.405190 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.405216 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.405302 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:09 crc kubenswrapper[4751]: E0131 14:43:09.405343 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.405383 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:09 crc kubenswrapper[4751]: E0131 14:43:09.405513 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:09 crc kubenswrapper[4751]: E0131 14:43:09.405606 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:09 crc kubenswrapper[4751]: E0131 14:43:09.405691 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.413853 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 09:39:50.737406983 +0000 UTC Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.479600 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.479702 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.479723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.479785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.479805 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.582830 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.582891 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.582912 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.582939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.582956 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.685335 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.685400 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.685439 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.685470 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.685494 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.787586 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.787640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.787701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.787727 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.787744 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.890573 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.890640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.890662 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.890692 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.890713 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.993957 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.994020 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.994043 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.994105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:09 crc kubenswrapper[4751]: I0131 14:43:09.994135 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:09Z","lastTransitionTime":"2026-01-31T14:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.097263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.097317 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.097336 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.097359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.097377 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.199615 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.199650 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.199658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.199673 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.199687 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.302921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.302972 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.302990 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.303013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.303030 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.405734 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.405769 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.405780 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.405793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.405805 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.414489 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 08:09:51.858982127 +0000 UTC Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.508668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.508732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.508748 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.508773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.508794 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.611605 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.611670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.611686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.611711 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.611728 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.714565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.714610 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.714618 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.714632 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.714641 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.818115 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.818179 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.818202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.818231 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.818252 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.921836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.921907 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.921932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.921960 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:10 crc kubenswrapper[4751]: I0131 14:43:10.921986 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:10Z","lastTransitionTime":"2026-01-31T14:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.024801 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.024876 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.024902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.024931 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.024955 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:11Z","lastTransitionTime":"2026-01-31T14:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.127910 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.127977 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.127999 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.128028 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.128052 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:11Z","lastTransitionTime":"2026-01-31T14:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.231014 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.231101 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.231120 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.231143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.231160 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:11Z","lastTransitionTime":"2026-01-31T14:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.334201 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.334493 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.334514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.334538 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.334556 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:11Z","lastTransitionTime":"2026-01-31T14:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.405209 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.405274 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.405300 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:11 crc kubenswrapper[4751]: E0131 14:43:11.405446 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.405486 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:11 crc kubenswrapper[4751]: E0131 14:43:11.405812 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:11 crc kubenswrapper[4751]: E0131 14:43:11.405896 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:11 crc kubenswrapper[4751]: E0131 14:43:11.406003 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.415595 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:28:56.781087954 +0000 UTC Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.437293 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.437373 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.437400 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.437430 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:11 crc kubenswrapper[4751]: I0131 14:43:11.437457 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:11Z","lastTransitionTime":"2026-01-31T14:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.799399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.799443 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.799459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.799483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.799500 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:12Z","lastTransitionTime":"2026-01-31T14:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.814640 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 14:44:52.235904482 +0000 UTC Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.819437 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:12 crc kubenswrapper[4751]: E0131 14:43:12.819639 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.819944 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:12 crc kubenswrapper[4751]: E0131 14:43:12.820048 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.820295 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:12 crc kubenswrapper[4751]: E0131 14:43:12.820478 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.820555 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:12 crc kubenswrapper[4751]: E0131 14:43:12.820730 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.902545 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.902630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.902649 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.902672 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:12 crc kubenswrapper[4751]: I0131 14:43:12.902688 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:12Z","lastTransitionTime":"2026-01-31T14:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.005491 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.005537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.005556 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.005578 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.005595 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.108406 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.108463 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.108481 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.108508 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.108525 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.210575 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.210608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.210616 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.210629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.210638 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.313766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.313810 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.313828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.313850 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.313868 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.417372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.417411 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.417420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.417435 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.417444 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.520364 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.520434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.520457 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.520488 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.520509 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.623438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.623483 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.623495 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.623514 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.623526 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.730225 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.730288 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.730306 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.730331 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.730350 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.814889 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:18:17.208372652 +0000 UTC Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.832892 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.832958 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.832976 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.833001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.833018 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.937031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.937118 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.937143 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.937189 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:13 crc kubenswrapper[4751]: I0131 14:43:13.937214 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:13Z","lastTransitionTime":"2026-01-31T14:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.040834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.040885 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.040902 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.040927 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.040948 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.144153 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.144233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.144259 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.144290 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.144314 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.247836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.247887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.247904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.247928 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.247946 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.350479 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.350539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.350555 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.350580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.350674 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.405119 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.405226 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.405261 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.405383 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.405426 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.405599 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.405761 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.405835 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.453597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.453651 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.453670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.453695 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.453712 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.527769 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.527829 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.527844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.527868 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.527892 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.548658 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.553725 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.553777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.553794 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.553817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.553834 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.573812 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.579172 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.579263 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.579312 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.579338 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.579355 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.599713 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.605017 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.605064 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.605110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.605134 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.605151 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.626449 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.632582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.632640 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.632658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.632683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.632712 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.652528 4751 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T14:43:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2bc08d22-1e39-4800-b402-ea260cc19637\\\",\\\"systemUUID\\\":\\\"ad4951a2-dbbf-4c4e-af3f-cce3c25b01ef\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:14Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:14 crc kubenswrapper[4751]: E0131 14:43:14.652815 4751 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.654838 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.654893 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.654918 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.654946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.654968 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.757590 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.757658 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.757684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.757712 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.757737 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.815687 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 22:01:52.995827389 +0000 UTC Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.859903 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.859963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.859984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.860013 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.860033 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.963465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.963509 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.963525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.963574 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:14 crc kubenswrapper[4751]: I0131 14:43:14.963594 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:14Z","lastTransitionTime":"2026-01-31T14:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.066759 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.066833 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.066849 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.066873 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.066889 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.170093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.170144 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.170160 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.170183 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.170202 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.273368 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.273434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.273451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.273477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.273495 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.376315 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.376361 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.376370 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.376410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.376420 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.480062 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.480145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.480162 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.480185 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.480204 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.583150 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.583275 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.583298 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.583327 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.583348 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.686041 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.686416 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.686608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.686761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.686886 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.790985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.791060 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.791108 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.791135 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.791154 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.816752 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:13:21.548094262 +0000 UTC Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.894342 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.894417 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.894436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.894460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.894477 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.997739 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.997798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.997815 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.997841 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:15 crc kubenswrapper[4751]: I0131 14:43:15.997860 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:15Z","lastTransitionTime":"2026-01-31T14:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.101528 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.101585 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.101603 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.101629 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.101647 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.204617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.204683 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.204706 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.204766 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.204785 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.307191 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.307244 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.307262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.307285 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.307305 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.405472 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.405508 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:16 crc kubenswrapper[4751]: E0131 14:43:16.405848 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.406110 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:16 crc kubenswrapper[4751]: E0131 14:43:16.406928 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:16 crc kubenswrapper[4751]: E0131 14:43:16.407003 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.407319 4751 scope.go:117] "RemoveContainer" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" Jan 31 14:43:16 crc kubenswrapper[4751]: E0131 14:43:16.407562 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.409127 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:16 crc kubenswrapper[4751]: E0131 14:43:16.409350 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.410105 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.410164 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.410188 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.410326 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.410353 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.426168 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7adba94225f6d18961ccad4eab46916b607a2bff977a48d36fee6810aa4b7293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11d68e8091e3d18fff0b3c8f744c2e9002b6307e17815b221af7123715d8abf3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.444451 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-68hvr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"658471aa-68b2-478e-9522-ef5533009174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5c5f03d09a887caf686dc5e5d446c4589c7e577cd65e9329cd73d1f33af3d53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbl8x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-68hvr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.475500 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:43:00Z\\\",\\\"message\\\":\\\"t handler 5\\\\nI0131 14:43:00.256116 6857 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256190 6857 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 14:43:00.256429 6857 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 14:43:00.256447 6857 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 14:43:00.256455 6857 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 14:43:00.256450 6857 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 14:43:00.256470 6857 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 14:43:00.256481 6857 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 14:43:00.256463 6857 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 14:43:00.256712 6857 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://122856fd111512aaab
2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zhmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n8cdt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.496900 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d4eee728dbd9b6b0ade5149fd481cb3a688a6856f56a3a25a6cb49c944ee27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.513584 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.513645 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.513660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.513686 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.513703 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.519110 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.538999 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.556952 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c170e8-22c9-43a9-8b34-9d626c2ccddc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d86496066c7ba97ec9852c160ecb2e25f49cb0b7ae16e667f6279be42fd52679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3956d143be77f4a50143f9678eb51ab7871e250c
ae73d87c9e7fce2575e466c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fv47c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wpj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.572704 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af117b1f-6308-4303-bff0-ebc3a310c356\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e9f62b49c0d916da6e1631f3216d52fd37ab407e878dc0509ccb19d0e5fb1df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0845dfce4ee156b5b52e07b6257d62908413eba9570b3767b9f00724e81e034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.608644 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23027958-cbc9-4206-8dd5-13f10df7f298\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4a4eb52c2c850f91c212fdc556452ab8cc91168ddb67c2078b806d8725be2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e66ea760a35f4e073d5ead7b0270164010b4dd14737b23202f83a10290f75d3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa739a6a66bd2196c9131cf929bdb8a133e3e40c3dfa9a105bb3ea33fa2ede20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d196e489f72bd3c04ada6d0ea993f0ad89eb42497efc8723720ca3a7720509\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0b0fe57d51f2684ba60b1818c1e3010e5364c6d196433972b46cb3c3f9b5e61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccd9efb7096722c8a48318444b235a1970fbec711faf7448d47696ff84da5d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccd9efb7096722c8a48318444b235a1970fbec711faf7448d47696ff84da5d37\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1020dca4733e38925646f97eb80524c4060630e33323e9a5a0fdc4634c6b468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1020dca4733e38925646f97eb80524c4060630e33323e9a5a0fdc4634c6b468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf0f78147bc50d98a5ba239c2456467778fb4724433d914b9ee4300ce3af6e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf0f78147bc50d98a5ba239c2456467778fb4724433d914b9ee4300ce3af6e4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.619182 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.619416 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.619622 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.619808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.619936 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.632694 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6afb84a0-c564-45aa-b7a1-cd6f8273fe45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T14:41:55Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 14:41:50.030836 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 14:41:50.033684 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-814856940/tls.crt::/tmp/serving-cert-814856940/tls.key\\\\\\\"\\\\nI0131 14:41:55.828999 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 14:41:55.832995 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 14:41:55.833028 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 14:41:55.833056 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 14:41:55.833065 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 14:41:55.854492 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 14:41:55.854645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 14:41:55.854778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 14:41:55.854833 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 14:41:55.854881 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 14:41:55.854891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 14:41:55.855031 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 14:41:55.858196 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.652117 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64468352-f9fe-48bb-b204-b9f828c06bf8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c56c1e31014f9e3d0be8140f58cff1c752ad4be1c6c60a942bc18320bbd37b6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41a7c5739a571e6f3ec88c3798ad2604382b9320c44ddda3d41681a64c6ab2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a6478c4477b785bcb405d597f1c835faaf4ef7adb3a2bcd6e70cc2e692f44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3cbc9dfffff59a784e871d47bf2ab1b8419b11840eb329ce21c0f7c24e6e77c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:41:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.669671 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd8c0730-67df-445e-a6ce-c2edce5d9c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f27098d3c41dbb10f76ba04a8a989e91ff3eb6fe0fb0ca746e33417839235c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34680c760b5c6a6e2a521731e962301b54aa3
184d5d66792fb43e991c6502a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-45q8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4v79q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.688901 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b7a3883-0c5a-4116-adb0-ad25f69cd7f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daae5a3d70247f744d33c832eaf3080a91e73ffe0b5327d111ec8e87175becdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9317b698da5ee8a5f40c046d810783e523f27b6806319023e57e08f5cae267c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff7cd215439083e1e83ef38a79946be1c7e9bbc20d66cdb931c456a21bbdf17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T14:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:41:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.714478 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc4ac60a9db8d22605139ad928fb8fd667aaba0b8563be03e796ad077fb525bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.723465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.723554 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.723574 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.723630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.723649 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.735760 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-rtthp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7dd989b-33df-4562-a60b-f273428fea3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T14:42:49Z\\\",\\\"message\\\":\\\"2026-01-31T14:42:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508\\\\n2026-01-31T14:42:03+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eaa2f7bd-5331-496e-8b4a-3784e7751508 to /host/opt/cni/bin/\\\\n2026-01-31T14:42:04Z [verbose] multus-daemon started\\\\n2026-01-31T14:42:04Z [verbose] Readiness Indicator file check\\\\n2026-01-31T14:42:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwrtf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-rtthp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.760733 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rp5sb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5353863-ec39-4357-9b86-9be42ca17916\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99c893782bcbcbd370030fe164c6849abe258ce88db65e3d22c269d20d0bef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53c2988d52c487b3b7faab22afc5682e10470560a0ce155a557f45b416496b39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8c316d1ca94c88d30ed85232f3d03c8790ddbb3776a0d7215a504e0344903db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d3c862143add96dd4656517f8402208608d33197fdb402a1cc3fde7f6c441e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c73f
b39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c73fb39c6c9db93acb0fdc34044b1cafc05046a470e3897eeaa7fac8107691e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd0a7bc51ee2198f74d0adbfa358d772dc5176a7671b6a551157ad301348181d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dbe202f26fa5c09ba45db2a0bbb7fd7796b91e665df4529eddfdabd6ce083fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T14:42:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T14:42:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgxmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rp5sb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.781611 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T14:41:57Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.797047 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lxrfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b895e2a-7887-41c3-b641-9c72bb085dda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d0546721c181de3a5f2f4e791dcaeba69f94467bfbf08b4470332a0b4b49fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T14:42:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s9hbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lxrfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.815774 4751 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68aeb9c7-d3c3-4c34-96ab-bb947421c504\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T14:42:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hljmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T14:42:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xtn6l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T14:43:16Z is after 2025-08-24T17:21:41Z" Jan 31 14:43:16 crc 
kubenswrapper[4751]: I0131 14:43:16.817914 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 00:20:45.253881276 +0000 UTC Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.826338 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.826390 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.826407 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.826432 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.826450 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.929382 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.929428 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.929444 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.929465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:16 crc kubenswrapper[4751]: I0131 14:43:16.929481 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:16Z","lastTransitionTime":"2026-01-31T14:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.032647 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.032690 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.032705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.032728 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.032745 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.136196 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.136247 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.136264 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.136321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.136341 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.239464 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.239525 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.239534 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.239550 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.239559 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.342923 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.342981 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.343000 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.343026 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.343046 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.445787 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.445842 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.445858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.445878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.445895 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.549321 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.549391 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.549410 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.549434 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.549453 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.653061 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.653157 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.653176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.653200 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.653219 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.756489 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.756549 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.756565 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.756589 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.756605 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.818863 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 01:54:51.983733595 +0000 UTC Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.859793 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.859842 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.859858 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.859882 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.859899 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.962583 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.962631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.962648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.962670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:17 crc kubenswrapper[4751]: I0131 14:43:17.962687 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:17Z","lastTransitionTime":"2026-01-31T14:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.065245 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.065339 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.065362 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.065440 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.065466 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.168656 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.169140 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.169344 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.169532 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.169686 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.272921 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.273197 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.273372 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.273524 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.273655 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.376627 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.376723 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.376738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.376792 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.376809 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.405145 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.405243 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.405272 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:18 crc kubenswrapper[4751]: E0131 14:43:18.405436 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.405472 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:18 crc kubenswrapper[4751]: E0131 14:43:18.405572 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:18 crc kubenswrapper[4751]: E0131 14:43:18.405716 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:18 crc kubenswrapper[4751]: E0131 14:43:18.405954 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.478612 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.478648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.478659 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.478675 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.478687 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.581817 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.581939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.581963 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.581997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.582022 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.685181 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.685265 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.685292 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.685324 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.685345 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.787897 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.787967 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.787992 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.788024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.788046 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.819449 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 00:46:12.457118546 +0000 UTC Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.891138 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.891216 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.891242 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.891271 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.891292 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.993339 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.993413 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.993438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.993468 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:18 crc kubenswrapper[4751]: I0131 14:43:18.993488 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:18Z","lastTransitionTime":"2026-01-31T14:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.095668 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.095722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.095738 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.095761 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.095777 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.198938 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.198997 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.199021 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.199051 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.199106 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.302139 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.302194 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.302210 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.302234 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.302253 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.405701 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.405786 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.405809 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.405834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.405856 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.508359 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.508420 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.508436 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.508460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.508476 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.611719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.611773 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.611790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.611813 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.611831 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.714539 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.714608 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.714631 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.714660 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.714684 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.816715 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.816777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.816796 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.816820 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.816838 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.819853 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 21:24:17.563291148 +0000 UTC Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.918864 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.918914 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.918932 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.918956 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.918975 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:19Z","lastTransitionTime":"2026-01-31T14:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:19 crc kubenswrapper[4751]: I0131 14:43:19.953315 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:19 crc kubenswrapper[4751]: E0131 14:43:19.953484 4751 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:43:19 crc kubenswrapper[4751]: E0131 14:43:19.953554 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs podName:68aeb9c7-d3c3-4c34-96ab-bb947421c504 nodeName:}" failed. No retries permitted until 2026-01-31 14:44:23.95353235 +0000 UTC m=+168.328245275 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs") pod "network-metrics-daemon-xtn6l" (UID: "68aeb9c7-d3c3-4c34-96ab-bb947421c504") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.021674 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.021719 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.021736 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.021758 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.021773 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.125228 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.125286 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.125340 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.125367 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.125386 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.228368 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.228448 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.228465 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.228899 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.228990 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.332798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.332863 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.332884 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.332915 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.332938 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.405686 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.405822 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.406150 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.406193 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:20 crc kubenswrapper[4751]: E0131 14:43:20.406312 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:20 crc kubenswrapper[4751]: E0131 14:43:20.406512 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:20 crc kubenswrapper[4751]: E0131 14:43:20.406656 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:20 crc kubenswrapper[4751]: E0131 14:43:20.406819 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.436327 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.436383 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.436399 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.436425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.436446 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.539177 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.539233 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.539249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.539272 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.539289 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.642634 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.642684 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.642700 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.642722 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.642738 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.745570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.745648 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.745670 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.745698 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.745725 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.820934 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 08:20:27.620610725 +0000 UTC Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.848058 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.848135 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.848152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.848176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.848193 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.951438 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.951538 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.951553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.951580 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:20 crc kubenswrapper[4751]: I0131 14:43:20.951611 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:20Z","lastTransitionTime":"2026-01-31T14:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.056141 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.056219 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.056243 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.056277 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.056300 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.160100 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.160176 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.160195 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.160224 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.160243 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.263799 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.263904 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.263920 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.263946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.263968 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.367480 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.367559 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.367584 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.367622 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.367651 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.471498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.471570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.471620 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.471643 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.471659 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.578234 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.578397 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.578425 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.578460 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.578497 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.682866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.683001 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.683030 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.683124 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.683152 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.786874 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.786946 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.786965 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.786993 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.787014 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.821775 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:33:08.975076182 +0000 UTC Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.889806 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.889878 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.889901 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.889939 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.889966 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.992705 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.992785 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.992808 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.992836 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:21 crc kubenswrapper[4751]: I0131 14:43:21.992859 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:21Z","lastTransitionTime":"2026-01-31T14:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.096731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.096777 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.096798 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.096822 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.096840 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.201031 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.201128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.201145 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.201173 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.201193 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.304731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.304812 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.304834 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.304866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.304895 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.405064 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.405168 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:22 crc kubenswrapper[4751]: E0131 14:43:22.405342 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.405439 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.405555 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:22 crc kubenswrapper[4751]: E0131 14:43:22.405686 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:22 crc kubenswrapper[4751]: E0131 14:43:22.405796 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:22 crc kubenswrapper[4751]: E0131 14:43:22.405892 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.408054 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.408126 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.408144 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.408166 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.408185 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.511978 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.512032 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.512044 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.512093 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.512109 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.615169 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.615222 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.615246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.615276 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.615295 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.718262 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.718329 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.718345 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.718368 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.718385 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.821477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.821535 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.821553 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.821582 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.821599 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.822577 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 15:39:16.410631765 +0000 UTC Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.925035 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.925110 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.925128 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.925152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:22 crc kubenswrapper[4751]: I0131 14:43:22.925171 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:22Z","lastTransitionTime":"2026-01-31T14:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.027985 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.028048 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.028066 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.028120 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.028137 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.131688 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.131776 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.131796 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.131828 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.131850 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.235667 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.235731 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.235749 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.235774 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.235793 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.338451 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.338506 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.338516 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.338536 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.338547 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.442177 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.442240 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.442257 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.442281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.442298 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.546119 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.546198 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.546217 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.546246 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.546269 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.649477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.649572 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.649598 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.649638 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.649665 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.753216 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.753289 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.753310 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.753337 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.753356 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.822781 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 19:01:42.810278795 +0000 UTC Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.857477 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.857543 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.857560 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.858529 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.858567 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.962578 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.962998 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.963249 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.963446 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:23 crc kubenswrapper[4751]: I0131 14:43:23.963586 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:23Z","lastTransitionTime":"2026-01-31T14:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.067192 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.067589 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.067790 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.068033 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.068258 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.171281 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.172024 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.172152 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.172202 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.172220 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.275626 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.275696 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.275708 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.275732 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.275747 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.379537 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.379597 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.379610 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.379630 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.379646 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.405063 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.405118 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.405200 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.405123 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:24 crc kubenswrapper[4751]: E0131 14:43:24.405340 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:24 crc kubenswrapper[4751]: E0131 14:43:24.405463 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:24 crc kubenswrapper[4751]: E0131 14:43:24.405599 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:24 crc kubenswrapper[4751]: E0131 14:43:24.405924 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.483498 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.483570 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.483588 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.483617 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.483638 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.587393 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.587459 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.587476 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.587502 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.587521 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.690771 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.690844 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.690866 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.690896 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.690919 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.793900 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.793954 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.793966 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.793984 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.793997 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.822978 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:51:38.261430905 +0000 UTC Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.898042 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.898148 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.898167 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.898199 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.898223 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.971887 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.971982 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.972010 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.972045 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 14:43:24 crc kubenswrapper[4751]: I0131 14:43:24.972111 4751 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T14:43:24Z","lastTransitionTime":"2026-01-31T14:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.044856 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v"] Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.045494 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.049132 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.049412 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.049569 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.051034 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.109770 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lxrfr" podStartSLOduration=85.109739001 podStartE2EDuration="1m25.109739001s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.092456282 +0000 UTC m=+109.467169197" watchObservedRunningTime="2026-01-31 14:43:25.109739001 +0000 UTC m=+109.484451926" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.171625 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-68hvr" podStartSLOduration=85.171585481 podStartE2EDuration="1m25.171585481s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.170432141 +0000 UTC m=+109.545145056" watchObservedRunningTime="2026-01-31 14:43:25.171585481 +0000 UTC m=+109.546298406" Jan 31 
14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.227727 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c799c46b-62ca-4376-bcdf-6b77761ad60a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.227841 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c799c46b-62ca-4376-bcdf-6b77761ad60a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.228038 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c799c46b-62ca-4376-bcdf-6b77761ad60a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.228148 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c799c46b-62ca-4376-bcdf-6b77761ad60a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.228254 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c799c46b-62ca-4376-bcdf-6b77761ad60a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.245328 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.245296108 podStartE2EDuration="57.245296108s" podCreationTimestamp="2026-01-31 14:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.245008061 +0000 UTC m=+109.619720976" watchObservedRunningTime="2026-01-31 14:43:25.245296108 +0000 UTC m=+109.620009023" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.329669 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c799c46b-62ca-4376-bcdf-6b77761ad60a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.329731 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c799c46b-62ca-4376-bcdf-6b77761ad60a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.329787 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c799c46b-62ca-4376-bcdf-6b77761ad60a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" 
(UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.329872 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c799c46b-62ca-4376-bcdf-6b77761ad60a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.329928 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c799c46b-62ca-4376-bcdf-6b77761ad60a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.330031 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c799c46b-62ca-4376-bcdf-6b77761ad60a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.330572 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c799c46b-62ca-4376-bcdf-6b77761ad60a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.331467 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/c799c46b-62ca-4376-bcdf-6b77761ad60a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.331744 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podStartSLOduration=85.331716767 podStartE2EDuration="1m25.331716767s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.331704577 +0000 UTC m=+109.706417502" watchObservedRunningTime="2026-01-31 14:43:25.331716767 +0000 UTC m=+109.706429692" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.339807 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c799c46b-62ca-4376-bcdf-6b77761ad60a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.367045 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c799c46b-62ca-4376-bcdf-6b77761ad60a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8ph7v\" (UID: \"c799c46b-62ca-4376-bcdf-6b77761ad60a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.374307 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.403500 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.403479234 podStartE2EDuration="25.403479234s" podCreationTimestamp="2026-01-31 14:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.357392085 +0000 UTC m=+109.732105000" watchObservedRunningTime="2026-01-31 14:43:25.403479234 +0000 UTC m=+109.778192129" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.407313 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=23.407294434 podStartE2EDuration="23.407294434s" podCreationTimestamp="2026-01-31 14:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.402780176 +0000 UTC m=+109.777493071" watchObservedRunningTime="2026-01-31 14:43:25.407294434 +0000 UTC m=+109.782007319" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.429642 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.429622255 podStartE2EDuration="1m28.429622255s" podCreationTimestamp="2026-01-31 14:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.425840426 +0000 UTC m=+109.800553321" watchObservedRunningTime="2026-01-31 14:43:25.429622255 +0000 UTC m=+109.804335150" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.468189 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-rp5sb" podStartSLOduration=84.468162277 podStartE2EDuration="1m24.468162277s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.453482915 +0000 UTC m=+109.828195810" watchObservedRunningTime="2026-01-31 14:43:25.468162277 +0000 UTC m=+109.842875192" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.487172 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=88.487149741 podStartE2EDuration="1m28.487149741s" podCreationTimestamp="2026-01-31 14:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.486873224 +0000 UTC m=+109.861586129" watchObservedRunningTime="2026-01-31 14:43:25.487149741 +0000 UTC m=+109.861862646" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.487486 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4v79q" podStartSLOduration=84.48747655 podStartE2EDuration="1m24.48747655s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.468737672 +0000 UTC m=+109.843450577" watchObservedRunningTime="2026-01-31 14:43:25.48747655 +0000 UTC m=+109.862189445" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.514125 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rtthp" podStartSLOduration=84.514060082 podStartE2EDuration="1m24.514060082s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.513890807 +0000 UTC m=+109.888603732" watchObservedRunningTime="2026-01-31 14:43:25.514060082 +0000 UTC m=+109.888772967" Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.824211 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 06:16:53.999549957 +0000 UTC Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.824317 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.835635 4751 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.870202 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" event={"ID":"c799c46b-62ca-4376-bcdf-6b77761ad60a","Type":"ContainerStarted","Data":"28aa82b3a8224fba26d4ab32d68fa1950fe98de42ee5b731dcf5aae8f47bfc4b"} Jan 31 14:43:25 crc kubenswrapper[4751]: I0131 14:43:25.870286 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" event={"ID":"c799c46b-62ca-4376-bcdf-6b77761ad60a","Type":"ContainerStarted","Data":"5446e418d148c28288fe46409508bc2b5114d5b8e44072195c980de3cedbc769"} Jan 31 14:43:26 crc kubenswrapper[4751]: I0131 14:43:26.405385 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:26 crc kubenswrapper[4751]: I0131 14:43:26.405483 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:26 crc kubenswrapper[4751]: E0131 14:43:26.407224 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:26 crc kubenswrapper[4751]: I0131 14:43:26.407272 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:26 crc kubenswrapper[4751]: E0131 14:43:26.407477 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:26 crc kubenswrapper[4751]: I0131 14:43:26.407539 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:26 crc kubenswrapper[4751]: E0131 14:43:26.407724 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:26 crc kubenswrapper[4751]: E0131 14:43:26.407563 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:28 crc kubenswrapper[4751]: I0131 14:43:28.404934 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:28 crc kubenswrapper[4751]: I0131 14:43:28.405050 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:28 crc kubenswrapper[4751]: E0131 14:43:28.405174 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:28 crc kubenswrapper[4751]: I0131 14:43:28.405050 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:28 crc kubenswrapper[4751]: E0131 14:43:28.405274 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:28 crc kubenswrapper[4751]: E0131 14:43:28.405336 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:28 crc kubenswrapper[4751]: I0131 14:43:28.405351 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:28 crc kubenswrapper[4751]: E0131 14:43:28.405445 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:29 crc kubenswrapper[4751]: I0131 14:43:29.406807 4751 scope.go:117] "RemoveContainer" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" Jan 31 14:43:29 crc kubenswrapper[4751]: E0131 14:43:29.407168 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n8cdt_openshift-ovn-kubernetes(ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" Jan 31 14:43:30 crc kubenswrapper[4751]: I0131 14:43:30.405900 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:30 crc kubenswrapper[4751]: I0131 14:43:30.405972 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:30 crc kubenswrapper[4751]: I0131 14:43:30.406017 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:30 crc kubenswrapper[4751]: I0131 14:43:30.405898 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:30 crc kubenswrapper[4751]: E0131 14:43:30.406200 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:30 crc kubenswrapper[4751]: E0131 14:43:30.406332 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:30 crc kubenswrapper[4751]: E0131 14:43:30.406445 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:30 crc kubenswrapper[4751]: E0131 14:43:30.406589 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:32 crc kubenswrapper[4751]: I0131 14:43:32.404952 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:32 crc kubenswrapper[4751]: I0131 14:43:32.404964 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:32 crc kubenswrapper[4751]: I0131 14:43:32.405091 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:32 crc kubenswrapper[4751]: I0131 14:43:32.405278 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:32 crc kubenswrapper[4751]: E0131 14:43:32.405290 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:32 crc kubenswrapper[4751]: E0131 14:43:32.405389 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:32 crc kubenswrapper[4751]: E0131 14:43:32.405518 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:32 crc kubenswrapper[4751]: E0131 14:43:32.405857 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:34 crc kubenswrapper[4751]: I0131 14:43:34.405012 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:34 crc kubenswrapper[4751]: I0131 14:43:34.405139 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:34 crc kubenswrapper[4751]: I0131 14:43:34.405139 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:34 crc kubenswrapper[4751]: E0131 14:43:34.406331 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:34 crc kubenswrapper[4751]: I0131 14:43:34.405315 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:34 crc kubenswrapper[4751]: E0131 14:43:34.406759 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:34 crc kubenswrapper[4751]: E0131 14:43:34.406335 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:34 crc kubenswrapper[4751]: E0131 14:43:34.407159 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.908330 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/1.log" Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.909430 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/0.log" Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.909507 4751 generic.go:334] "Generic (PLEG): container finished" podID="e7dd989b-33df-4562-a60b-f273428fea3d" containerID="2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475" exitCode=1 Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.909563 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerDied","Data":"2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475"} Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.909623 4751 scope.go:117] "RemoveContainer" containerID="7ef2aa95d0f98ef0706040984786e2b48337984b508447f1b49b155ffb5f4608" Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.911465 4751 scope.go:117] "RemoveContainer" containerID="2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475" Jan 31 14:43:35 crc kubenswrapper[4751]: E0131 14:43:35.911906 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-rtthp_openshift-multus(e7dd989b-33df-4562-a60b-f273428fea3d)\"" pod="openshift-multus/multus-rtthp" podUID="e7dd989b-33df-4562-a60b-f273428fea3d" Jan 31 14:43:35 crc kubenswrapper[4751]: I0131 14:43:35.939158 4751 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8ph7v" podStartSLOduration=95.939130477 podStartE2EDuration="1m35.939130477s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:25.893100314 +0000 UTC m=+110.267813209" watchObservedRunningTime="2026-01-31 14:43:35.939130477 +0000 UTC m=+120.313843402" Jan 31 14:43:36 crc kubenswrapper[4751]: E0131 14:43:36.380937 4751 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 31 14:43:36 crc kubenswrapper[4751]: I0131 14:43:36.404910 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:36 crc kubenswrapper[4751]: I0131 14:43:36.404966 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:36 crc kubenswrapper[4751]: I0131 14:43:36.405011 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:36 crc kubenswrapper[4751]: I0131 14:43:36.405060 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:36 crc kubenswrapper[4751]: E0131 14:43:36.407146 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:36 crc kubenswrapper[4751]: E0131 14:43:36.407253 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:36 crc kubenswrapper[4751]: E0131 14:43:36.407438 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:36 crc kubenswrapper[4751]: E0131 14:43:36.407587 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:36 crc kubenswrapper[4751]: E0131 14:43:36.503255 4751 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 31 14:43:36 crc kubenswrapper[4751]: I0131 14:43:36.915128 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/1.log" Jan 31 14:43:38 crc kubenswrapper[4751]: I0131 14:43:38.405362 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:38 crc kubenswrapper[4751]: E0131 14:43:38.405508 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:38 crc kubenswrapper[4751]: I0131 14:43:38.405541 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:38 crc kubenswrapper[4751]: I0131 14:43:38.405580 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:38 crc kubenswrapper[4751]: E0131 14:43:38.405687 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:38 crc kubenswrapper[4751]: E0131 14:43:38.405957 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:38 crc kubenswrapper[4751]: I0131 14:43:38.405382 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:38 crc kubenswrapper[4751]: E0131 14:43:38.406351 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:40 crc kubenswrapper[4751]: I0131 14:43:40.405665 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:40 crc kubenswrapper[4751]: I0131 14:43:40.405745 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:40 crc kubenswrapper[4751]: E0131 14:43:40.406374 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:40 crc kubenswrapper[4751]: I0131 14:43:40.405859 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:40 crc kubenswrapper[4751]: E0131 14:43:40.406685 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:40 crc kubenswrapper[4751]: E0131 14:43:40.406567 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:40 crc kubenswrapper[4751]: I0131 14:43:40.405812 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:40 crc kubenswrapper[4751]: E0131 14:43:40.406988 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:41 crc kubenswrapper[4751]: E0131 14:43:41.504446 4751 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.405615 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.405681 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.405726 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.405772 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:42 crc kubenswrapper[4751]: E0131 14:43:42.406819 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:42 crc kubenswrapper[4751]: E0131 14:43:42.406899 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:42 crc kubenswrapper[4751]: E0131 14:43:42.406972 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.407116 4751 scope.go:117] "RemoveContainer" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" Jan 31 14:43:42 crc kubenswrapper[4751]: E0131 14:43:42.407515 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.938148 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/3.log" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.940681 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerStarted","Data":"153c98b7ebe36043f7ae094ec4ae3226c12652e95174c4ff2d00efc441bdb785"} Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.941144 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:43:42 crc kubenswrapper[4751]: I0131 14:43:42.978656 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podStartSLOduration=101.9786403 podStartE2EDuration="1m41.9786403s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:43:42.976955907 +0000 UTC m=+127.351668802" watchObservedRunningTime="2026-01-31 14:43:42.9786403 +0000 UTC m=+127.353353205" Jan 31 14:43:43 crc kubenswrapper[4751]: I0131 14:43:43.391776 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xtn6l"] Jan 31 14:43:43 crc kubenswrapper[4751]: I0131 14:43:43.391878 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:43 crc kubenswrapper[4751]: E0131 14:43:43.391974 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:44 crc kubenswrapper[4751]: I0131 14:43:44.404856 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:44 crc kubenswrapper[4751]: I0131 14:43:44.404952 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:44 crc kubenswrapper[4751]: E0131 14:43:44.405409 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:44 crc kubenswrapper[4751]: I0131 14:43:44.405051 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:44 crc kubenswrapper[4751]: E0131 14:43:44.405517 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:44 crc kubenswrapper[4751]: E0131 14:43:44.405679 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:45 crc kubenswrapper[4751]: I0131 14:43:45.405814 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:45 crc kubenswrapper[4751]: E0131 14:43:45.405954 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:46 crc kubenswrapper[4751]: I0131 14:43:46.405305 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:46 crc kubenswrapper[4751]: I0131 14:43:46.405327 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:46 crc kubenswrapper[4751]: I0131 14:43:46.406462 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:46 crc kubenswrapper[4751]: E0131 14:43:46.406454 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:46 crc kubenswrapper[4751]: E0131 14:43:46.406638 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:46 crc kubenswrapper[4751]: E0131 14:43:46.406727 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:46 crc kubenswrapper[4751]: E0131 14:43:46.506371 4751 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 14:43:47 crc kubenswrapper[4751]: I0131 14:43:47.405565 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:47 crc kubenswrapper[4751]: E0131 14:43:47.405711 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:47 crc kubenswrapper[4751]: I0131 14:43:47.406286 4751 scope.go:117] "RemoveContainer" containerID="2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475" Jan 31 14:43:47 crc kubenswrapper[4751]: I0131 14:43:47.960597 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/1.log" Jan 31 14:43:47 crc kubenswrapper[4751]: I0131 14:43:47.960990 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerStarted","Data":"98a2f0e75ca2c214fba50a70792a41195e5b7e674dbe1ae5b98cd015b7526483"} Jan 31 14:43:48 crc kubenswrapper[4751]: I0131 14:43:48.405208 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:48 crc kubenswrapper[4751]: I0131 14:43:48.405288 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:48 crc kubenswrapper[4751]: I0131 14:43:48.405328 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:48 crc kubenswrapper[4751]: E0131 14:43:48.405911 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:48 crc kubenswrapper[4751]: E0131 14:43:48.405689 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:48 crc kubenswrapper[4751]: E0131 14:43:48.406050 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:49 crc kubenswrapper[4751]: I0131 14:43:49.405641 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:49 crc kubenswrapper[4751]: E0131 14:43:49.405826 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:50 crc kubenswrapper[4751]: I0131 14:43:50.405149 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:50 crc kubenswrapper[4751]: I0131 14:43:50.405147 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:50 crc kubenswrapper[4751]: E0131 14:43:50.405298 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 14:43:50 crc kubenswrapper[4751]: E0131 14:43:50.405371 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 14:43:50 crc kubenswrapper[4751]: I0131 14:43:50.405168 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:50 crc kubenswrapper[4751]: E0131 14:43:50.405568 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 14:43:51 crc kubenswrapper[4751]: I0131 14:43:51.405120 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:51 crc kubenswrapper[4751]: E0131 14:43:51.405254 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xtn6l" podUID="68aeb9c7-d3c3-4c34-96ab-bb947421c504" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.405369 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.405423 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.405587 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.409149 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.409209 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.409397 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 14:43:52 crc kubenswrapper[4751]: I0131 14:43:52.409620 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 14:43:53 crc kubenswrapper[4751]: I0131 14:43:53.405058 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l" Jan 31 14:43:53 crc kubenswrapper[4751]: I0131 14:43:53.407943 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 14:43:53 crc kubenswrapper[4751]: I0131 14:43:53.407941 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.824953 4751 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.874323 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxjf5"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.874853 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.878599 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.879377 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.889251 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.889344 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.890029 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.890518 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.890815 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.891104 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.891464 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.891661 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.891831 4751 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.891929 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.892327 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.892367 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.892632 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.892778 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.893054 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.895395 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.897487 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pmglg"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.897958 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.900147 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.900801 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.901058 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.901343 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.901567 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.906305 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4gqrl"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.907009 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.907586 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4m7jl"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.913207 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-4m7jl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.922230 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.945411 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.945791 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.946003 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.946831 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.947002 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.947171 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.947421 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.947750 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.948009 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 
14:43:55.948455 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.948623 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.948811 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-db5pg"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.949004 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5f7jc"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.949150 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.949246 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.948814 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.948877 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.949524 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.950306 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.950606 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.951635 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.952095 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.952259 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.952383 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.955631 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.955831 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.956026 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.956164 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.956244 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 14:43:55 crc 
kubenswrapper[4751]: I0131 14:43:55.956554 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.957242 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.957481 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.957636 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.957837 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.958000 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.958224 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.958383 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.960379 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8fgxq"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.960706 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.961179 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.961666 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.962250 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.962736 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.963303 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.966132 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.966447 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.966955 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.967514 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5hn9b"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.967685 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.967996 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968184 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968257 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968403 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968483 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968619 4751 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968935 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969186 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.976720 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.968973 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971773 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcd7a932-6db9-4cca-b619-852242324725-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977360 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977410 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977469 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977500 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-config\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977542 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2lr7\" (UniqueName: \"kubernetes.io/projected/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-kube-api-access-x2lr7\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977579 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") 
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977611 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stc4r\" (UniqueName: \"kubernetes.io/projected/bcd7a932-6db9-4cca-b619-852242324725-kube-api-access-stc4r\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977644 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bcd7a932-6db9-4cca-b619-852242324725-images\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977701 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-audit-policies\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977732 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-audit-dir\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977767 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-client-ca\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977902 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz6sg\" (UniqueName: \"kubernetes.io/projected/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-kube-api-access-dz6sg\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977961 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8p9z\" (UniqueName: \"kubernetes.io/projected/d723501b-bb29-4d60-ad97-239eb749771f-kube-api-access-f8p9z\") pod \"downloads-7954f5f757-4m7jl\" (UID: \"d723501b-bb29-4d60-ad97-239eb749771f\") " pod="openshift-console/downloads-7954f5f757-4m7jl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.977994 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-encryption-config\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978034 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-service-ca-bundle\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:55 
crc kubenswrapper[4751]: I0131 14:43:55.978100 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-config\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcrc\" (UniqueName: \"kubernetes.io/projected/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-kube-api-access-6lcrc\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978181 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-etcd-client\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978212 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbprf\" (UniqueName: \"kubernetes.io/projected/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-kube-api-access-fbprf\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978246 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978277 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-serving-cert\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978310 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-serving-cert\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978381 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-serving-cert\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969114 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978415 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd7a932-6db9-4cca-b619-852242324725-config\") pod 
\"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969239 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.978662 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969384 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969431 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969458 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969481 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969514 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969598 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.969795 4751 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.970994 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971206 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971235 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971260 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971287 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971314 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971355 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971380 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971406 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971463 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.971490 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.980004 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpbgx"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.980529 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.980859 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hgs4c"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.983200 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.984472 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.984981 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985013 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985150 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985258 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985295 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985303 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985357 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985442 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.985510 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.996031 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"] Jan 31 14:43:55 crc kubenswrapper[4751]: I0131 14:43:55.996890 4751 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.005093 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.027860 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.028418 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xr2gt"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.028492 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.028831 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.029162 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.029808 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.029956 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.030142 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.030293 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.030403 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.030534 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.030664 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.030837 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.031412 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-h262z"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.031804 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.032071 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.032715 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.032887 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.033676 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.035657 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.037159 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.037641 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.039552 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.043417 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.045531 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.046434 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.048764 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.049382 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.049734 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.049832 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.049949 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.050779 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.050877 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.051759 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.052321 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.052767 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.052955 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.057855 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.058290 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.058554 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.058564 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.058645 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.059454 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.059499 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5r6kv"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.060559 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.060648 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.060973 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.061143 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.062120 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.070848 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.071041 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.071817 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vbfvz"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.072123 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.072500 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxjf5"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.072539 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.072553 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-skzbg"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.073296 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.073376 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-skzbg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.073409 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.074560 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pmglg"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.075768 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4gqrl"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.077189 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.077735 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.078756 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"] Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079430 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcd7a932-6db9-4cca-b619-852242324725-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079469 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-ca\") pod 
\"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079486 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-service-ca\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079502 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57wkh\" (UniqueName: \"kubernetes.io/projected/6a74f65d-f8d2-41af-8469-6f8d020b41de-kube-api-access-57wkh\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079569 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079585 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079603 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079622 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079639 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9tnj\" (UniqueName: \"kubernetes.io/projected/b9810521-7440-49d4-bf04-7dbe3324cc5b-kube-api-access-b9tnj\") pod \"multus-admission-controller-857f4d67dd-v8p8j\" (UID: \"b9810521-7440-49d4-bf04-7dbe3324cc5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079670 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6a74f65d-f8d2-41af-8469-6f8d020b41de-trusted-ca\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079685 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079701 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-config\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079717 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a74f65d-f8d2-41af-8469-6f8d020b41de-config\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079734 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2lr7\" (UniqueName: \"kubernetes.io/projected/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-kube-api-access-x2lr7\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079751 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-client\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079767 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079784 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079799 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stc4r\" (UniqueName: \"kubernetes.io/projected/bcd7a932-6db9-4cca-b619-852242324725-kube-api-access-stc4r\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079813 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-dir\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079831 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bcd7a932-6db9-4cca-b619-852242324725-images\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079845 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-serving-cert\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079871 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13811e7-14eb-4a17-90a1-345619f9fb29-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079889 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-audit-policies\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079909 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-audit-dir\") pod 
\"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079928 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-client-ca\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079952 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz6sg\" (UniqueName: \"kubernetes.io/projected/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-kube-api-access-dz6sg\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079967 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079981 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f13811e7-14eb-4a17-90a1-345619f9fb29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.079999 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/466718f1-f118-4f13-a983-14060aef09e6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.080819 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-audit-policies\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.080395 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-audit-dir\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.080249 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.080978 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8894\" (UniqueName: \"kubernetes.io/projected/466718f1-f118-4f13-a983-14060aef09e6-kube-api-access-p8894\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081010 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081101 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8p9z\" (UniqueName: \"kubernetes.io/projected/d723501b-bb29-4d60-ad97-239eb749771f-kube-api-access-f8p9z\") pod \"downloads-7954f5f757-4m7jl\" (UID: \"d723501b-bb29-4d60-ad97-239eb749771f\") " pod="openshift-console/downloads-7954f5f757-4m7jl" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081126 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-encryption-config\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081142 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-service-ca-bundle\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081165 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/aceeef0f-cb36-43d6-8e09-35949fe73911-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081199 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-config\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081220 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhrf\" (UniqueName: \"kubernetes.io/projected/aceeef0f-cb36-43d6-8e09-35949fe73911-kube-api-access-7hhrf\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081240 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5246\" (UniqueName: \"kubernetes.io/projected/4ba2ceb2-34e1-487c-9b13-0a480d6cc521-kube-api-access-h5246\") pod \"dns-operator-744455d44c-8fgxq\" (UID: \"4ba2ceb2-34e1-487c-9b13-0a480d6cc521\") " pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081258 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f13811e7-14eb-4a17-90a1-345619f9fb29-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081281 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcrc\" (UniqueName: \"kubernetes.io/projected/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-kube-api-access-6lcrc\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/466718f1-f118-4f13-a983-14060aef09e6-metrics-tls\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081324 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-etcd-client\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081345 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbprf\" (UniqueName: \"kubernetes.io/projected/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-kube-api-access-fbprf\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/aceeef0f-cb36-43d6-8e09-35949fe73911-proxy-tls\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081377 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-client-ca\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081382 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081026 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081458 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" 
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081523 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-config\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081532 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bcd7a932-6db9-4cca-b619-852242324725-images\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081673 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-serving-cert\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.081747 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aceeef0f-cb36-43d6-8e09-35949fe73911-images\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.083656 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-service-ca-bundle\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.084183 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.089200 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.090881 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsmj\" (UniqueName: \"kubernetes.io/projected/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-kube-api-access-xwsmj\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.090966 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-serving-cert\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.091025 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-policies\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.091515    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-serving-cert\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.091686    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd7a932-6db9-4cca-b619-852242324725-config\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.093434    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-config\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.093919    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-serving-cert\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.094699    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd7a932-6db9-4cca-b619-852242324725-config\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.094701    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-etcd-client\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.094790    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.094852    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnk4h\" (UniqueName: \"kubernetes.io/projected/802d5225-ef3f-485c-bb85-3c0f18e42952-kube-api-access-rnk4h\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095316    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095422    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-config\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095460    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095509    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095532    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a74f65d-f8d2-41af-8469-6f8d020b41de-serving-cert\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095523    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-serving-cert\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095550    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095593    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/466718f1-f118-4f13-a983-14060aef09e6-trusted-ca\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095613    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ba2ceb2-34e1-487c-9b13-0a480d6cc521-metrics-tls\") pod \"dns-operator-744455d44c-8fgxq\" (UID: \"4ba2ceb2-34e1-487c-9b13-0a480d6cc521\") " pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095634    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.095653    4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9810521-7440-49d4-bf04-7dbe3324cc5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v8p8j\" (UID: \"b9810521-7440-49d4-bf04-7dbe3324cc5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.096736    4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.096748    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: \"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.098412    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-encryption-config\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.098619    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xr2gt"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.100994    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-serving-cert\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.098850    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h262z"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.101527    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcd7a932-6db9-4cca-b619-852242324725-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.102329    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.104903    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.105654    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.108983    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-db5pg"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.110103    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8fgxq"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.111663    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.113608    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5f7jc"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.114579    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.115838    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.117144    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.117555    4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.118811    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hgs4c"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.120187    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.120208    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.123172    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.124737    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vbfvz"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.125622    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4m7jl"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.127020    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.128142    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.129119    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5r6kv"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.130083    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.131009    4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x4njd"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.131516    4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x4njd"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.132816    4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x4rnh"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.133631    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpbgx"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.133683    4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.133920    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.136107    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x4rnh"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.137097    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.137206    4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.138106    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-skzbg"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.139053    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.140089    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.141027    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.141998    4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qdsgb"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.142582    4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qdsgb"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.142964    4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qdsgb"]
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.157019    4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.177229    4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196271    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196314    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196366    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196399    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9tnj\" (UniqueName: \"kubernetes.io/projected/b9810521-7440-49d4-bf04-7dbe3324cc5b-kube-api-access-b9tnj\") pod \"multus-admission-controller-857f4d67dd-v8p8j\" (UID: \"b9810521-7440-49d4-bf04-7dbe3324cc5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196426    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a74f65d-f8d2-41af-8469-6f8d020b41de-trusted-ca\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196456    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196479    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a74f65d-f8d2-41af-8469-6f8d020b41de-config\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196510    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-client\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196540    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-dir\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196563    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-serving-cert\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196584    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-dir\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196598    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13811e7-14eb-4a17-90a1-345619f9fb29-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196658    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196702    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196728    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f13811e7-14eb-4a17-90a1-345619f9fb29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196754    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/466718f1-f118-4f13-a983-14060aef09e6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196784    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8894\" (UniqueName: \"kubernetes.io/projected/466718f1-f118-4f13-a983-14060aef09e6-kube-api-access-p8894\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196824    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aceeef0f-cb36-43d6-8e09-35949fe73911-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196847    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f13811e7-14eb-4a17-90a1-345619f9fb29-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196883    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhrf\" (UniqueName: \"kubernetes.io/projected/aceeef0f-cb36-43d6-8e09-35949fe73911-kube-api-access-7hhrf\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196908    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5246\" (UniqueName: \"kubernetes.io/projected/4ba2ceb2-34e1-487c-9b13-0a480d6cc521-kube-api-access-h5246\") pod \"dns-operator-744455d44c-8fgxq\" (UID: \"4ba2ceb2-34e1-487c-9b13-0a480d6cc521\") " pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196938    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/466718f1-f118-4f13-a983-14060aef09e6-metrics-tls\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196972    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aceeef0f-cb36-43d6-8e09-35949fe73911-proxy-tls\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.196995    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197025    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aceeef0f-cb36-43d6-8e09-35949fe73911-images\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197045    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsmj\" (UniqueName: \"kubernetes.io/projected/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-kube-api-access-xwsmj\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197063    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-policies\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197164    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnk4h\" (UniqueName: \"kubernetes.io/projected/802d5225-ef3f-485c-bb85-3c0f18e42952-kube-api-access-rnk4h\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197186    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-config\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197229    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a74f65d-f8d2-41af-8469-6f8d020b41de-config\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197234    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197335    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197356    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a74f65d-f8d2-41af-8469-6f8d020b41de-serving-cert\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197376    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197392    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197411    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/466718f1-f118-4f13-a983-14060aef09e6-trusted-ca\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197427    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ba2ceb2-34e1-487c-9b13-0a480d6cc521-metrics-tls\") pod \"dns-operator-744455d44c-8fgxq\" (UID: \"4ba2ceb2-34e1-487c-9b13-0a480d6cc521\") " pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197444    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9810521-7440-49d4-bf04-7dbe3324cc5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v8p8j\" (UID: \"b9810521-7440-49d4-bf04-7dbe3324cc5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197475    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-ca\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.197742    4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.198268    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aceeef0f-cb36-43d6-8e09-35949fe73911-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.198355    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-service-ca\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.198388    4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57wkh\" (UniqueName: \"kubernetes.io/projected/6a74f65d-f8d2-41af-8469-6f8d020b41de-kube-api-access-57wkh\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.198645    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-ca\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.198930    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a74f65d-f8d2-41af-8469-6f8d020b41de-trusted-ca\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.199115    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-config\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.199140    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-service-ca\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.200635    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-serving-cert\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.200793    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ba2ceb2-34e1-487c-9b13-0a480d6cc521-metrics-tls\") pod \"dns-operator-744455d44c-8fgxq\" (UID: \"4ba2ceb2-34e1-487c-9b13-0a480d6cc521\") " pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.200984    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-etcd-client\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.201365    4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9810521-7440-49d4-bf04-7dbe3324cc5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v8p8j\" (UID: \"b9810521-7440-49d4-bf04-7dbe3324cc5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"
Jan 31 14:43:56 crc
kubenswrapper[4751]: I0131 14:43:56.201936 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a74f65d-f8d2-41af-8469-6f8d020b41de-serving-cert\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.210821 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.226005 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.234490 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.241957 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.251715 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: 
\"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.257057 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.262263 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.276682 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.280345 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.297236 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.301381 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.317860 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.320540 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.337847 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.348579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.357639 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.359180 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-policies\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.376810 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.387931 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.396865 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.398759 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.422333 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.428609 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.437740 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 
14:43:56.457469 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.463487 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/466718f1-f118-4f13-a983-14060aef09e6-metrics-tls\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.477247 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.504248 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.511253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/466718f1-f118-4f13-a983-14060aef09e6-trusted-ca\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.516981 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.537639 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.556712 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.577184 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" 
Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.598047 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.617725 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.644862 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.657268 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.676985 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.698124 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.717750 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.738022 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.757697 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.762247 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f13811e7-14eb-4a17-90a1-345619f9fb29-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: 
\"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.777978 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.788174 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f13811e7-14eb-4a17-90a1-345619f9fb29-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.797334 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.817154 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.819148 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aceeef0f-cb36-43d6-8e09-35949fe73911-images\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.862775 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.873231 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/aceeef0f-cb36-43d6-8e09-35949fe73911-proxy-tls\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.877551 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.917809 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.938264 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.957915 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.978762 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 14:43:56 crc kubenswrapper[4751]: I0131 14:43:56.997344 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.018539 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.038534 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.057647 4751 request.go:700] Waited for 1.004990057s due to 
client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dcluster-image-registry-operator-dockercfg-m4qtx&limit=500&resourceVersion=0 Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.059765 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.079286 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.097961 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.118128 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.139054 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.158549 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.177416 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.198925 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.218625 4751 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.237557 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.258594 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.277849 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.301698 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.318054 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.338020 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.357899 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.379013 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.397732 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.417655 4751 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.438393 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.458468 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.478458 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.498297 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.530293 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.538292 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.558310 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.577961 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.599280 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.618559 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.639128 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.658632 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.678163 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.698837 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.718732 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.737788 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.758635 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.806852 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz6sg\" (UniqueName: \"kubernetes.io/projected/5f75ab4e-45c1-4ed9-b966-afa91dbc88a6-kube-api-access-dz6sg\") pod \"apiserver-7bbb656c7d-wdsj4\" (UID: \"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.824269 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2lr7\" (UniqueName: \"kubernetes.io/projected/e14d9fb0-f377-4331-8bc1-8f4017bb95a3-kube-api-access-x2lr7\") pod \"openshift-config-operator-7777fb866f-z9dj7\" (UID: 
\"e14d9fb0-f377-4331-8bc1-8f4017bb95a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.849594 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stc4r\" (UniqueName: \"kubernetes.io/projected/bcd7a932-6db9-4cca-b619-852242324725-kube-api-access-stc4r\") pod \"machine-api-operator-5694c8668f-4gqrl\" (UID: \"bcd7a932-6db9-4cca-b619-852242324725\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.852010 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.865522 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcrc\" (UniqueName: \"kubernetes.io/projected/b1b479ec-e8d7-4fb6-8d0b-9fac28697df7-kube-api-access-6lcrc\") pod \"authentication-operator-69f744f599-pmglg\" (UID: \"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.886323 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8p9z\" (UniqueName: \"kubernetes.io/projected/d723501b-bb29-4d60-ad97-239eb749771f-kube-api-access-f8p9z\") pod \"downloads-7954f5f757-4m7jl\" (UID: \"d723501b-bb29-4d60-ad97-239eb749771f\") " pod="openshift-console/downloads-7954f5f757-4m7jl" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.898966 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.904563 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbprf\" (UniqueName: 
\"kubernetes.io/projected/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-kube-api-access-fbprf\") pod \"controller-manager-879f6c89f-sxjf5\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5"
Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.918817 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.937552 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.958679 4751 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.977858 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 31 14:43:57 crc kubenswrapper[4751]: I0131 14:43:57.998679 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.013174 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.017977 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.028422 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.039318 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.057856 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.072721 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.076297 4751 request.go:700] Waited for 1.933476135s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.078964 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.121275 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9tnj\" (UniqueName: \"kubernetes.io/projected/b9810521-7440-49d4-bf04-7dbe3324cc5b-kube-api-access-b9tnj\") pod \"multus-admission-controller-857f4d67dd-v8p8j\" (UID: \"b9810521-7440-49d4-bf04-7dbe3324cc5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.132516 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.144473 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/466718f1-f118-4f13-a983-14060aef09e6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.154301 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8894\" (UniqueName: \"kubernetes.io/projected/466718f1-f118-4f13-a983-14060aef09e6-kube-api-access-p8894\") pod \"ingress-operator-5b745b69d9-drp8h\" (UID: \"466718f1-f118-4f13-a983-14060aef09e6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.165627 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4m7jl"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.182177 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f13811e7-14eb-4a17-90a1-345619f9fb29-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czqdr\" (UID: \"f13811e7-14eb-4a17-90a1-345619f9fb29\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.204580 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhrf\" (UniqueName: \"kubernetes.io/projected/aceeef0f-cb36-43d6-8e09-35949fe73911-kube-api-access-7hhrf\") pod \"machine-config-operator-74547568cd-w4lzx\" (UID: \"aceeef0f-cb36-43d6-8e09-35949fe73911\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.210463 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5246\" (UniqueName: \"kubernetes.io/projected/4ba2ceb2-34e1-487c-9b13-0a480d6cc521-kube-api-access-h5246\") pod \"dns-operator-744455d44c-8fgxq\" (UID: \"4ba2ceb2-34e1-487c-9b13-0a480d6cc521\") " pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.236788 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsmj\" (UniqueName: \"kubernetes.io/projected/2d8dfb13-f0a0-465c-821d-95f0df0a98cf-kube-api-access-xwsmj\") pod \"etcd-operator-b45778765-hgs4c\" (UID: \"2d8dfb13-f0a0-465c-821d-95f0df0a98cf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.247218 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4gqrl"]
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.250993 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.254251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnk4h\" (UniqueName: \"kubernetes.io/projected/802d5225-ef3f-485c-bb85-3c0f18e42952-kube-api-access-rnk4h\") pod \"oauth-openshift-558db77b4-xr2gt\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.269235 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.271288 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57wkh\" (UniqueName: \"kubernetes.io/projected/6a74f65d-f8d2-41af-8469-6f8d020b41de-kube-api-access-57wkh\") pod \"console-operator-58897d9998-db5pg\" (UID: \"6a74f65d-f8d2-41af-8469-6f8d020b41de\") " pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.281678 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.281911 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4"]
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.295449 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:43:58 crc kubenswrapper[4751]: W0131 14:43:58.301062 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd7a932_6db9_4cca_b619_852242324725.slice/crio-365a012e0dc256fca522addd41c83a63fb808621ec56987dcd5bb062ddda6006 WatchSource:0}: Error finding container 365a012e0dc256fca522addd41c83a63fb808621ec56987dcd5bb062ddda6006: Status 404 returned error can't find the container with id 365a012e0dc256fca522addd41c83a63fb808621ec56987dcd5bb062ddda6006
Jan 31 14:43:58 crc kubenswrapper[4751]: W0131 14:43:58.302736 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f75ab4e_45c1_4ed9_b966_afa91dbc88a6.slice/crio-ab97991b8db78d8e752d40d1152ee78f2b57fd461dd5402070894dd0d8f788b9 WatchSource:0}: Error finding container ab97991b8db78d8e752d40d1152ee78f2b57fd461dd5402070894dd0d8f788b9: Status 404 returned error can't find the container with id ab97991b8db78d8e752d40d1152ee78f2b57fd461dd5402070894dd0d8f788b9
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.317950 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337342 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2930a-5ae3-4171-a3dd-e5eea62ef157-serving-cert\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337386 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e18e163-6cf0-48ef-9a6f-90cbece870b0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337410 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-bound-sa-token\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337430 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-trusted-ca-bundle\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337448 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-etcd-client\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337481 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-trusted-ca\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337505 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdnj\" (UniqueName: \"kubernetes.io/projected/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-kube-api-access-vwdnj\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337532 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-audit\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337930 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-certificates\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337965 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89314349-bbc8-4886-b93b-51358e4e71b0-audit-dir\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.337989 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d7ae01-24ad-448d-ae7c-10df353833f4-config\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338046 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-etcd-serving-ca\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-service-ca-bundle\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338465 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4b193a-a01b-440a-a94a-55c4b5f06586-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338236 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338522 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338718 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-config\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338775 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-service-ca\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sjfc\" (UniqueName: \"kubernetes.io/projected/853ca050-beae-4089-a5df-9556eeda508b-kube-api-access-7sjfc\") pod \"cluster-samples-operator-665b6dd947-8m7f4\" (UID: \"853ca050-beae-4089-a5df-9556eeda508b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338845 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4b193a-a01b-440a-a94a-55c4b5f06586-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/853ca050-beae-4089-a5df-9556eeda508b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8m7f4\" (UID: \"853ca050-beae-4089-a5df-9556eeda508b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338926 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-stats-auth\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.338965 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-oauth-config\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339092 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-config\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8662\" (UniqueName: \"kubernetes.io/projected/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-kube-api-access-n8662\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339185 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17d7ae01-24ad-448d-ae7c-10df353833f4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339238 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-default-certificate\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339287 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-image-import-ca\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339321 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wswl\" (UniqueName: \"kubernetes.io/projected/4c4b193a-a01b-440a-a94a-55c4b5f06586-kube-api-access-9wswl\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339345 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339415 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e18e163-6cf0-48ef-9a6f-90cbece870b0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339437 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-config\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339480 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvlhx\" (UniqueName: \"kubernetes.io/projected/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-kube-api-access-lvlhx\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.339516 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17d7ae01-24ad-448d-ae7c-10df353833f4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340333 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0058e7f4-92db-444d-a979-2880c3f83442-machine-approver-tls\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340397 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-client-ca\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340452 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9th82\" (UniqueName: \"kubernetes.io/projected/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-kube-api-access-9th82\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340486 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-serving-cert\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340509 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppx8c\" (UniqueName: \"kubernetes.io/projected/0058e7f4-92db-444d-a979-2880c3f83442-kube-api-access-ppx8c\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340571 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340592 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtk42\" (UniqueName: \"kubernetes.io/projected/89314349-bbc8-4886-b93b-51358e4e71b0-kube-api-access-mtk42\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.342489 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:58.842463596 +0000 UTC m=+143.217176571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.340621 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-oauth-serving-cert\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343243 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-tls\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/89314349-bbc8-4886-b93b-51358e4e71b0-node-pullsecrets\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343335 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqdw\" (UniqueName: \"kubernetes.io/projected/84e2930a-5ae3-4171-a3dd-e5eea62ef157-kube-api-access-ksqdw\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343371 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0058e7f4-92db-444d-a979-2880c3f83442-auth-proxy-config\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343457 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343492 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llx87\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-kube-api-access-llx87\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343638 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-metrics-certs\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.345223 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.343667 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-serving-cert\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.346395 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0058e7f4-92db-444d-a979-2880c3f83442-config\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.346432 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-encryption-config\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.346467 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-proxy-tls\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.346491 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.354275 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7"]
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.389509 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pmglg"]
Jan 31 14:43:58 crc kubenswrapper[4751]: W0131 14:43:58.422253 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode14d9fb0_f377_4331_8bc1_8f4017bb95a3.slice/crio-b04fce866f961463553a14856d3791616fdd2474de989b38b991dcb6c8a6211a WatchSource:0}: Error finding container b04fce866f961463553a14856d3791616fdd2474de989b38b991dcb6c8a6211a: Status 404 returned error can't find the container with id b04fce866f961463553a14856d3791616fdd2474de989b38b991dcb6c8a6211a
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.447723 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.447956 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:58.947912401 +0000 UTC m=+143.322625276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448015 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448039 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtk42\" (UniqueName: \"kubernetes.io/projected/89314349-bbc8-4886-b93b-51358e4e71b0-kube-api-access-mtk42\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448059 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89a244ab-c405-48aa-893f-f50995384ede-webhook-cert\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448092 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-tls\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448107 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-oauth-serving-cert\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448123 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448140 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6lk6\" (UniqueName: \"kubernetes.io/projected/89a244ab-c405-48aa-893f-f50995384ede-kube-api-access-f6lk6\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448167 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6be9bbf-6799-45e0-8d53-790a5484f3a4-metrics-tls\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448191 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/89314349-bbc8-4886-b93b-51358e4e71b0-node-pullsecrets\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448207 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z67k\" (UniqueName: \"kubernetes.io/projected/e999b5a4-2e54-4195-98fa-4c5fa36f1b3a-kube-api-access-2z67k\") pod \"ingress-canary-qdsgb\" (UID: \"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a\") " pod="openshift-ingress-canary/ingress-canary-qdsgb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448230 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqdw\" (UniqueName: \"kubernetes.io/projected/84e2930a-5ae3-4171-a3dd-e5eea62ef157-kube-api-access-ksqdw\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0058e7f4-92db-444d-a979-2880c3f83442-auth-proxy-config\") pod 
\"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448260 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448286 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llx87\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-kube-api-access-llx87\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448301 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-metrics-certs\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448317 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-registration-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448321 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/89314349-bbc8-4886-b93b-51358e4e71b0-node-pullsecrets\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448333 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5gk\" (UniqueName: \"kubernetes.io/projected/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-kube-api-access-pj5gk\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448386 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-plugins-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448403 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4td9\" (UniqueName: \"kubernetes.io/projected/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-kube-api-access-r4td9\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448423 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-serving-cert\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 
31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.448433 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:58.948424684 +0000 UTC m=+143.323137569 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448545 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0058e7f4-92db-444d-a979-2880c3f83442-config\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448572 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-encryption-config\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448620 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-proxy-tls\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448639 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448691 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4lkn\" (UniqueName: \"kubernetes.io/projected/7014a649-2d58-4772-9eb3-697e4b925923-kube-api-access-z4lkn\") pod \"package-server-manager-789f6589d5-kvfvk\" (UID: \"7014a649-2d58-4772-9eb3-697e4b925923\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448719 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5c65\" (UniqueName: \"kubernetes.io/projected/b17c8e83-275b-4777-946a-c7360ad8fa48-kube-api-access-r5c65\") pod \"migrator-59844c95c7-b44gm\" (UID: \"b17c8e83-275b-4777-946a-c7360ad8fa48\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2930a-5ae3-4171-a3dd-e5eea62ef157-serving-cert\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448789 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-signing-key\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.448846 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7014a649-2d58-4772-9eb3-697e4b925923-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kvfvk\" (UID: \"7014a649-2d58-4772-9eb3-697e4b925923\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449178 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-config\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449227 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e18e163-6cf0-48ef-9a6f-90cbece870b0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449247 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-bound-sa-token\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: 
\"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449279 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-trusted-ca-bundle\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449317 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-etcd-client\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449336 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c630253-f658-44fb-891d-f560f1e2b577-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4drr\" (UID: \"5c630253-f658-44fb-891d-f560f1e2b577\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-trusted-ca\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449401 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/9edad05e-bd87-4a20-a947-6b09f9f7c93a-srv-cert\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449418 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9edad05e-bd87-4a20-a947-6b09f9f7c93a-profile-collector-cert\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449436 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwjbh\" (UniqueName: \"kubernetes.io/projected/eade01dc-846b-42a8-a6ed-8cf0a0663e82-kube-api-access-zwjbh\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449487 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdnj\" (UniqueName: \"kubernetes.io/projected/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-kube-api-access-vwdnj\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449504 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc13f997-316e-4e81-a56e-0fa6e02d1502-certs\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " 
pod="openshift-machine-config-operator/machine-config-server-x4njd" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449518 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89a244ab-c405-48aa-893f-f50995384ede-tmpfs\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449536 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9f99779e-5e77-4b5c-8886-7accebe8a897-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451342 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-trusted-ca-bundle\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451340 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-audit\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451440 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eade01dc-846b-42a8-a6ed-8cf0a0663e82-config-volume\") pod 
\"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451558 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89a244ab-c405-48aa-893f-f50995384ede-apiservice-cert\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451575 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8tl6\" (UniqueName: \"kubernetes.io/projected/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-kube-api-access-x8tl6\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451600 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-certificates\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451617 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89314349-bbc8-4886-b93b-51358e4e71b0-audit-dir\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451634 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/17d7ae01-24ad-448d-ae7c-10df353833f4-config\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451662 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451679 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9f99779e-5e77-4b5c-8886-7accebe8a897-srv-cert\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451700 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdrlx\" (UniqueName: \"kubernetes.io/projected/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-kube-api-access-qdrlx\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451719 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-etcd-serving-ca\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " 
pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451734 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-serving-cert\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451751 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-service-ca-bundle\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451767 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4b193a-a01b-440a-a94a-55c4b5f06586-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451839 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0058e7f4-92db-444d-a979-2880c3f83442-config\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451877 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f6be9bbf-6799-45e0-8d53-790a5484f3a4-config-volume\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451935 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-signing-cabundle\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.451953 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m898f\" (UniqueName: \"kubernetes.io/projected/dc13f997-316e-4e81-a56e-0fa6e02d1502-kube-api-access-m898f\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452123 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-config\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452176 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452200 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452283 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452307 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-socket-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452365 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw467\" (UniqueName: \"kubernetes.io/projected/9edad05e-bd87-4a20-a947-6b09f9f7c93a-kube-api-access-jw467\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452391 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-service-ca\") pod \"console-f9d7485db-h262z\" (UID: 
\"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452453 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sjfc\" (UniqueName: \"kubernetes.io/projected/853ca050-beae-4089-a5df-9556eeda508b-kube-api-access-7sjfc\") pod \"cluster-samples-operator-665b6dd947-8m7f4\" (UID: \"853ca050-beae-4089-a5df-9556eeda508b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452471 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4b193a-a01b-440a-a94a-55c4b5f06586-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452537 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/853ca050-beae-4089-a5df-9556eeda508b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8m7f4\" (UID: \"853ca050-beae-4089-a5df-9556eeda508b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452621 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-stats-auth\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452641 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-oauth-config\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452683 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-config\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452845 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4m7jl"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452869 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8662\" (UniqueName: \"kubernetes.io/projected/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-kube-api-access-n8662\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452889 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e999b5a4-2e54-4195-98fa-4c5fa36f1b3a-cert\") pod \"ingress-canary-qdsgb\" (UID: \"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a\") " pod="openshift-ingress-canary/ingress-canary-qdsgb" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452945 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452962 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17d7ae01-24ad-448d-ae7c-10df353833f4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452990 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-default-certificate\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453007 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-image-import-ca\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453024 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wswl\" (UniqueName: \"kubernetes.io/projected/4c4b193a-a01b-440a-a94a-55c4b5f06586-kube-api-access-9wswl\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453044 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453062 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngnfg\" (UniqueName: \"kubernetes.io/projected/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-kube-api-access-ngnfg\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453089 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453107 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcmz\" (UniqueName: \"kubernetes.io/projected/5c630253-f658-44fb-891d-f560f1e2b577-kube-api-access-gxcmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4drr\" (UID: \"5c630253-f658-44fb-891d-f560f1e2b577\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453151 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvlhx\" (UniqueName: \"kubernetes.io/projected/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-kube-api-access-lvlhx\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " 
pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453167 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17d7ae01-24ad-448d-ae7c-10df353833f4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453194 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e18e163-6cf0-48ef-9a6f-90cbece870b0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453209 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-config\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453226 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453251 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/0058e7f4-92db-444d-a979-2880c3f83442-machine-approver-tls\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453268 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eade01dc-846b-42a8-a6ed-8cf0a0663e82-secret-volume\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453303 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-csi-data-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453319 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453356 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-client-ca\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453372 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9th82\" (UniqueName: \"kubernetes.io/projected/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-kube-api-access-9th82\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453388 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-config\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453404 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453460 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-serving-cert\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453476 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-mountpoint-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453522 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grhf9\" (UniqueName: \"kubernetes.io/projected/9f99779e-5e77-4b5c-8886-7accebe8a897-kube-api-access-grhf9\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453542 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppx8c\" (UniqueName: \"kubernetes.io/projected/0058e7f4-92db-444d-a979-2880c3f83442-kube-api-access-ppx8c\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453559 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btktd\" (UniqueName: \"kubernetes.io/projected/f6be9bbf-6799-45e0-8d53-790a5484f3a4-kube-api-access-btktd\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453642 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhsxz\" (UniqueName: \"kubernetes.io/projected/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-kube-api-access-jhsxz\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453728 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc13f997-316e-4e81-a56e-0fa6e02d1502-node-bootstrap-token\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.453857 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.454161 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-trusted-ca\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.449527 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-oauth-serving-cert\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.455986 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-service-ca-bundle\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.456680 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.457006 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89314349-bbc8-4886-b93b-51358e4e71b0-audit-dir\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.452362 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0058e7f4-92db-444d-a979-2880c3f83442-auth-proxy-config\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.459863 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-service-ca\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.460944 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-certificates\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.462021 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-config\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.462537 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e18e163-6cf0-48ef-9a6f-90cbece870b0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.462753 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-client-ca\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.463932 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-encryption-config\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.464024 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.464082 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17d7ae01-24ad-448d-ae7c-10df353833f4-config\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.465531 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-default-certificate\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.465990 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-proxy-tls\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.466030 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-oauth-config\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc 
kubenswrapper[4751]: I0131 14:43:58.466148 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-serving-cert\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.466395 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4b193a-a01b-440a-a94a-55c4b5f06586-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.466661 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-audit\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.466859 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-etcd-serving-ca\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.467535 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-console-serving-cert\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.468330 
4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-image-import-ca\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.468579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-config\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.468635 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.470657 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e18e163-6cf0-48ef-9a6f-90cbece870b0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.470842 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-tls\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: 
I0131 14:43:58.471374 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0058e7f4-92db-444d-a979-2880c3f83442-machine-approver-tls\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.471632 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-metrics-certs\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.472294 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-stats-auth\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.472432 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17d7ae01-24ad-448d-ae7c-10df353833f4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.473643 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2930a-5ae3-4171-a3dd-e5eea62ef157-serving-cert\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.477520 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/853ca050-beae-4089-a5df-9556eeda508b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8m7f4\" (UID: \"853ca050-beae-4089-a5df-9556eeda508b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.486510 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.486712 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4b193a-a01b-440a-a94a-55c4b5f06586-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.489190 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89314349-bbc8-4886-b93b-51358e4e71b0-config\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.490450 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89314349-bbc8-4886-b93b-51358e4e71b0-etcd-client\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.507379 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mtk42\" (UniqueName: \"kubernetes.io/projected/89314349-bbc8-4886-b93b-51358e4e71b0-kube-api-access-mtk42\") pod \"apiserver-76f77b778f-5f7jc\" (UID: \"89314349-bbc8-4886-b93b-51358e4e71b0\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.508890 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxjf5"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.521832 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hgs4c"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.534642 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-bound-sa-token\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.534942 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.553208 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llx87\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-kube-api-access-llx87\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555028 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555265 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngnfg\" (UniqueName: \"kubernetes.io/projected/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-kube-api-access-ngnfg\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555324 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555343 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcmz\" (UniqueName: \"kubernetes.io/projected/5c630253-f658-44fb-891d-f560f1e2b577-kube-api-access-gxcmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4drr\" (UID: \"5c630253-f658-44fb-891d-f560f1e2b577\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555374 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555405 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eade01dc-846b-42a8-a6ed-8cf0a0663e82-secret-volume\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555420 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-csi-data-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555441 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-config\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555455 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555472 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555496 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-mountpoint-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555510 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grhf9\" (UniqueName: \"kubernetes.io/projected/9f99779e-5e77-4b5c-8886-7accebe8a897-kube-api-access-grhf9\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555531 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btktd\" (UniqueName: \"kubernetes.io/projected/f6be9bbf-6799-45e0-8d53-790a5484f3a4-kube-api-access-btktd\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555547 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhsxz\" (UniqueName: \"kubernetes.io/projected/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-kube-api-access-jhsxz\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555565 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc13f997-316e-4e81-a56e-0fa6e02d1502-node-bootstrap-token\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555587 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89a244ab-c405-48aa-893f-f50995384ede-webhook-cert\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555605 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555621 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6lk6\" (UniqueName: \"kubernetes.io/projected/89a244ab-c405-48aa-893f-f50995384ede-kube-api-access-f6lk6\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555638 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6be9bbf-6799-45e0-8d53-790a5484f3a4-metrics-tls\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555655 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z67k\" (UniqueName: \"kubernetes.io/projected/e999b5a4-2e54-4195-98fa-4c5fa36f1b3a-kube-api-access-2z67k\") pod \"ingress-canary-qdsgb\" (UID: \"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a\") " pod="openshift-ingress-canary/ingress-canary-qdsgb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555686 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-registration-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555700 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5gk\" (UniqueName: \"kubernetes.io/projected/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-kube-api-access-pj5gk\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555715 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-plugins-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555729 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4td9\" (UniqueName: \"kubernetes.io/projected/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-kube-api-access-r4td9\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4lkn\" (UniqueName: \"kubernetes.io/projected/7014a649-2d58-4772-9eb3-697e4b925923-kube-api-access-z4lkn\") pod \"package-server-manager-789f6589d5-kvfvk\" (UID: \"7014a649-2d58-4772-9eb3-697e4b925923\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555786 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5c65\" (UniqueName: \"kubernetes.io/projected/b17c8e83-275b-4777-946a-c7360ad8fa48-kube-api-access-r5c65\") pod \"migrator-59844c95c7-b44gm\" (UID: \"b17c8e83-275b-4777-946a-c7360ad8fa48\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555804 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7014a649-2d58-4772-9eb3-697e4b925923-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kvfvk\" (UID: \"7014a649-2d58-4772-9eb3-697e4b925923\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555818 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-config\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555832 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-signing-key\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555849 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c630253-f658-44fb-891d-f560f1e2b577-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4drr\" (UID: \"5c630253-f658-44fb-891d-f560f1e2b577\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555865 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9edad05e-bd87-4a20-a947-6b09f9f7c93a-srv-cert\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555882 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9edad05e-bd87-4a20-a947-6b09f9f7c93a-profile-collector-cert\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555896 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwjbh\" (UniqueName: \"kubernetes.io/projected/eade01dc-846b-42a8-a6ed-8cf0a0663e82-kube-api-access-zwjbh\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555917 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc13f997-316e-4e81-a56e-0fa6e02d1502-certs\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555937 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89a244ab-c405-48aa-893f-f50995384ede-tmpfs\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555951 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9f99779e-5e77-4b5c-8886-7accebe8a897-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555968 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eade01dc-846b-42a8-a6ed-8cf0a0663e82-config-volume\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.555986 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89a244ab-c405-48aa-893f-f50995384ede-apiservice-cert\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556000 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8tl6\" (UniqueName: \"kubernetes.io/projected/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-kube-api-access-x8tl6\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556016 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556032 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9f99779e-5e77-4b5c-8886-7accebe8a897-srv-cert\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556049 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdrlx\" (UniqueName: \"kubernetes.io/projected/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-kube-api-access-qdrlx\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556063 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-serving-cert\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556093 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6be9bbf-6799-45e0-8d53-790a5484f3a4-config-volume\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556109 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556124 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556137 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-socket-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556163 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-signing-cabundle\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m898f\" (UniqueName: \"kubernetes.io/projected/dc13f997-316e-4e81-a56e-0fa6e02d1502-kube-api-access-m898f\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556192 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw467\" (UniqueName: \"kubernetes.io/projected/9edad05e-bd87-4a20-a947-6b09f9f7c93a-kube-api-access-jw467\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e999b5a4-2e54-4195-98fa-4c5fa36f1b3a-cert\") pod \"ingress-canary-qdsgb\" (UID: \"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a\") " pod="openshift-ingress-canary/ingress-canary-qdsgb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.556841 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"
Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.556939 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.056897538 +0000 UTC m=+143.431610423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.559230 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-csi-data-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.560594 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-socket-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.561550 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.561729 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-signing-cabundle\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.562289 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eade01dc-846b-42a8-a6ed-8cf0a0663e82-secret-volume\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.562800 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6be9bbf-6799-45e0-8d53-790a5484f3a4-config-volume\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.563222 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.564932 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.565348 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-registration-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.565386 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-plugins-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.565909 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-mountpoint-dir\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.566095 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89a244ab-c405-48aa-893f-f50995384ede-tmpfs\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.566233 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-config\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.566305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eade01dc-846b-42a8-a6ed-8cf0a0663e82-config-volume\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.566982 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-signing-key\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.567087 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-config\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.567859 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc13f997-316e-4e81-a56e-0fa6e02d1502-certs\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.569731 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9edad05e-bd87-4a20-a947-6b09f9f7c93a-profile-collector-cert\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.571249 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9edad05e-bd87-4a20-a947-6b09f9f7c93a-srv-cert\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.574257 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.577272 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.577396 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c630253-f658-44fb-891d-f560f1e2b577-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4drr\" (UID: \"5c630253-f658-44fb-891d-f560f1e2b577\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.577542 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7014a649-2d58-4772-9eb3-697e4b925923-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kvfvk\" (UID: \"7014a649-2d58-4772-9eb3-697e4b925923\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.577567 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9f99779e-5e77-4b5c-8886-7accebe8a897-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.577718 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89a244ab-c405-48aa-893f-f50995384ede-webhook-cert\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.578042 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc13f997-316e-4e81-a56e-0fa6e02d1502-node-bootstrap-token\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.578242 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89a244ab-c405-48aa-893f-f50995384ede-apiservice-cert\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.578835 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9f99779e-5e77-4b5c-8886-7accebe8a897-srv-cert\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.579295 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-serving-cert\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.582815 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdnj\" (UniqueName: \"kubernetes.io/projected/ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4-kube-api-access-vwdnj\") pod \"openshift-controller-manager-operator-756b6f6bc6-s7gwp\" (UID: \"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.592376 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.594945 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17d7ae01-24ad-448d-ae7c-10df353833f4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cp47m\" (UID: \"17d7ae01-24ad-448d-ae7c-10df353833f4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.612884 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wswl\" (UniqueName: \"kubernetes.io/projected/4c4b193a-a01b-440a-a94a-55c4b5f06586-kube-api-access-9wswl\") pod \"openshift-apiserver-operator-796bbdcf4f-nz99n\" (UID: \"4c4b193a-a01b-440a-a94a-55c4b5f06586\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.637172 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sjfc\" (UniqueName: \"kubernetes.io/projected/853ca050-beae-4089-a5df-9556eeda508b-kube-api-access-7sjfc\") pod \"cluster-samples-operator-665b6dd947-8m7f4\" (UID: \"853ca050-beae-4089-a5df-9556eeda508b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.654891 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e999b5a4-2e54-4195-98fa-4c5fa36f1b3a-cert\") pod \"ingress-canary-qdsgb\" (UID: \"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a\") " pod="openshift-ingress-canary/ingress-canary-qdsgb"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.654968 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqdw\" (UniqueName: \"kubernetes.io/projected/84e2930a-5ae3-4171-a3dd-e5eea62ef157-kube-api-access-ksqdw\") pod \"route-controller-manager-6576b87f9c-7762w\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.655507 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.659728 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6be9bbf-6799-45e0-8d53-790a5484f3a4-metrics-tls\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg"
Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.661497 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.662170 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.162153398 +0000 UTC m=+143.536866283 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.662809 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.663785 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvlhx\" (UniqueName: \"kubernetes.io/projected/5caeb3dc-2a42-41b5-ac91-c1c8a216fb43-kube-api-access-lvlhx\") pod \"console-f9d7485db-h262z\" (UID: \"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43\") " pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.673012 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9th82\" (UniqueName: \"kubernetes.io/projected/d031fa1b-4d52-47d7-8c39-5fa21fb6c244-kube-api-access-9th82\") pod \"machine-config-controller-84d6567774-xv2tk\" (UID: \"d031fa1b-4d52-47d7-8c39-5fa21fb6c244\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: W0131 14:43:58.687860 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8dfb13_f0a0_465c_821d_95f0df0a98cf.slice/crio-f4985d6fa20c1f7c84743bb91673b6608bdb784531485a09c314e16cd026c074 WatchSource:0}: Error finding container f4985d6fa20c1f7c84743bb91673b6608bdb784531485a09c314e16cd026c074: Status 404 returned error can't find the container with id 
f4985d6fa20c1f7c84743bb91673b6608bdb784531485a09c314e16cd026c074 Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.691948 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8662\" (UniqueName: \"kubernetes.io/projected/01ff1674-4e01-4cdc-aea3-1e91a6a389e3-kube-api-access-n8662\") pod \"router-default-5444994796-5hn9b\" (UID: \"01ff1674-4e01-4cdc-aea3-1e91a6a389e3\") " pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.693286 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.712806 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppx8c\" (UniqueName: \"kubernetes.io/projected/0058e7f4-92db-444d-a979-2880c3f83442-kube-api-access-ppx8c\") pod \"machine-approver-56656f9798-cc6m2\" (UID: \"0058e7f4-92db-444d-a979-2880c3f83442\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.750246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngnfg\" (UniqueName: \"kubernetes.io/projected/ca236cfc-51d0-4d79-b90c-ddac400b4dbb-kube-api-access-ngnfg\") pod \"service-ca-operator-777779d784-ghblb\" (UID: \"ca236cfc-51d0-4d79-b90c-ddac400b4dbb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.762991 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.763395 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.263381631 +0000 UTC m=+143.638094516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.774105 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.780885 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.805254 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v8p8j"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.808043 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcmz\" (UniqueName: \"kubernetes.io/projected/5c630253-f658-44fb-891d-f560f1e2b577-kube-api-access-gxcmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-h4drr\" (UID: \"5c630253-f658-44fb-891d-f560f1e2b577\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.811190 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xr2gt"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.811834 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8fgxq"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.819881 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m898f\" (UniqueName: \"kubernetes.io/projected/dc13f997-316e-4e81-a56e-0fa6e02d1502-kube-api-access-m898f\") pod \"machine-config-server-x4njd\" (UID: \"dc13f997-316e-4e81-a56e-0fa6e02d1502\") " pod="openshift-machine-config-operator/machine-config-server-x4njd" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.825769 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.830493 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw467\" (UniqueName: \"kubernetes.io/projected/9edad05e-bd87-4a20-a947-6b09f9f7c93a-kube-api-access-jw467\") pod \"catalog-operator-68c6474976-vc9q2\" (UID: \"9edad05e-bd87-4a20-a947-6b09f9f7c93a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.838009 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.844457 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.847501 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.852113 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwjbh\" (UniqueName: \"kubernetes.io/projected/eade01dc-846b-42a8-a6ed-8cf0a0663e82-kube-api-access-zwjbh\") pod \"collect-profiles-29497830-rpwmp\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.858021 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.863666 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.864444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.864730 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.364716077 +0000 UTC m=+143.739428962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.869589 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f47a4e08-e21f-4a13-9ea2-bc1545a64cae-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l76jv\" (UID: \"f47a4e08-e21f-4a13-9ea2-bc1545a64cae\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.892984 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6lk6\" (UniqueName: \"kubernetes.io/projected/89a244ab-c405-48aa-893f-f50995384ede-kube-api-access-f6lk6\") pod \"packageserver-d55dfcdfc-7hjp9\" (UID: \"89a244ab-c405-48aa-893f-f50995384ede\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.899137 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5f7jc"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.910659 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdrlx\" (UniqueName: \"kubernetes.io/projected/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-kube-api-access-qdrlx\") pod \"marketplace-operator-79b997595-5r6kv\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.927466 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.930154 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.930288 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.934470 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btktd\" (UniqueName: \"kubernetes.io/projected/f6be9bbf-6799-45e0-8d53-790a5484f3a4-kube-api-access-btktd\") pod \"dns-default-skzbg\" (UID: \"f6be9bbf-6799-45e0-8d53-790a5484f3a4\") " pod="openshift-dns/dns-default-skzbg" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.938882 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-db5pg"] Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.951510 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4td9\" (UniqueName: \"kubernetes.io/projected/2ad3db81-4cb9-49a5-b4e0-55b546996fa0-kube-api-access-r4td9\") pod \"kube-storage-version-migrator-operator-b67b599dd-xqgfv\" (UID: \"2ad3db81-4cb9-49a5-b4e0-55b546996fa0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.952325 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.965899 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.966132 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.466091034 +0000 UTC m=+143.840803979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.966366 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:58 crc kubenswrapper[4751]: E0131 14:43:58.966703 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.466687049 +0000 UTC m=+143.841399934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:58 crc kubenswrapper[4751]: W0131 14:43:58.977334 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9810521_7440_49d4_bf04_7dbe3324cc5b.slice/crio-281ca335e21d1ef7df4a991d6128059fdc2c0654efcf33689fe07a1e7199b3d7 WatchSource:0}: Error finding container 281ca335e21d1ef7df4a991d6128059fdc2c0654efcf33689fe07a1e7199b3d7: Status 404 returned error can't find the container with id 281ca335e21d1ef7df4a991d6128059fdc2c0654efcf33689fe07a1e7199b3d7 Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.978894 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" Jan 31 14:43:58 crc kubenswrapper[4751]: W0131 14:43:58.981975 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod802d5225_ef3f_485c_bb85_3c0f18e42952.slice/crio-5084329a5b9f4efb799b2485cd137ef3c2a4c4cd5ed6e746dd1d5ef125ea23bd WatchSource:0}: Error finding container 5084329a5b9f4efb799b2485cd137ef3c2a4c4cd5ed6e746dd1d5ef125ea23bd: Status 404 returned error can't find the container with id 5084329a5b9f4efb799b2485cd137ef3c2a4c4cd5ed6e746dd1d5ef125ea23bd Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.983026 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5gk\" (UniqueName: \"kubernetes.io/projected/a55fc688-004a-4d6f-a48e-c10b0ae1d8f1-kube-api-access-pj5gk\") pod \"csi-hostpathplugin-x4rnh\" (UID: \"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1\") " pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.986360 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.993202 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" Jan 31 14:43:58 crc kubenswrapper[4751]: I0131 14:43:58.995786 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z67k\" (UniqueName: \"kubernetes.io/projected/e999b5a4-2e54-4195-98fa-4c5fa36f1b3a-kube-api-access-2z67k\") pod \"ingress-canary-qdsgb\" (UID: \"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a\") " pod="openshift-ingress-canary/ingress-canary-qdsgb" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.000498 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.008031 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.012771 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grhf9\" (UniqueName: \"kubernetes.io/projected/9f99779e-5e77-4b5c-8886-7accebe8a897-kube-api-access-grhf9\") pod \"olm-operator-6b444d44fb-7hc86\" (UID: \"9f99779e-5e77-4b5c-8886-7accebe8a897\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.014682 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:43:59 crc kubenswrapper[4751]: W0131 14:43:59.015133 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf13811e7_14eb_4a17_90a1_345619f9fb29.slice/crio-0a19bbbc968d497d45ccaceb151b6f5c6cef19ea57a552f00fe6312a063c997d WatchSource:0}: Error finding container 0a19bbbc968d497d45ccaceb151b6f5c6cef19ea57a552f00fe6312a063c997d: Status 404 returned error can't find the container with id 0a19bbbc968d497d45ccaceb151b6f5c6cef19ea57a552f00fe6312a063c997d Jan 31 14:43:59 crc kubenswrapper[4751]: W0131 14:43:59.016175 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a74f65d_f8d2_41af_8469_6f8d020b41de.slice/crio-ad3a2c6a62881881dc1b8603705ab22d4020343bea6e9b8d294520bb64a2c111 WatchSource:0}: Error finding container ad3a2c6a62881881dc1b8603705ab22d4020343bea6e9b8d294520bb64a2c111: Status 404 returned error can't find the container with id 
ad3a2c6a62881881dc1b8603705ab22d4020343bea6e9b8d294520bb64a2c111 Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.022011 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.036274 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.040238 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8tl6\" (UniqueName: \"kubernetes.io/projected/cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4-kube-api-access-x8tl6\") pod \"service-ca-9c57cc56f-vbfvz\" (UID: \"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4\") " pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.041794 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-skzbg" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.048574 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x4njd" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.050405 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" event={"ID":"802d5225-ef3f-485c-bb85-3c0f18e42952","Type":"ContainerStarted","Data":"5084329a5b9f4efb799b2485cd137ef3c2a4c4cd5ed6e746dd1d5ef125ea23bd"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.053406 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" event={"ID":"466718f1-f118-4f13-a983-14060aef09e6","Type":"ContainerStarted","Data":"fd0306ff302533f02a3392bf052bc01c096f3c7f4bb2c4d47d41b58b4fc67bb8"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.053997 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" event={"ID":"f13811e7-14eb-4a17-90a1-345619f9fb29","Type":"ContainerStarted","Data":"0a19bbbc968d497d45ccaceb151b6f5c6cef19ea57a552f00fe6312a063c997d"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.055256 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" event={"ID":"aceeef0f-cb36-43d6-8e09-35949fe73911","Type":"ContainerStarted","Data":"26ee040182e4644288f05d6a5f8a159c58f3e31b7f0c23c319060a2d9e2b325f"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.056007 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" event={"ID":"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4","Type":"ContainerStarted","Data":"a7e080d2bb12b2ede9d3754478b51602125694dbb1c64477101f2370cec9aee2"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.056810 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" event={"ID":"bcd7a932-6db9-4cca-b619-852242324725","Type":"ContainerStarted","Data":"9a0b4fb67f53340e222c61b9f6faae13df637f62fa98532035ec8c6f50ad1e2e"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.056839 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" event={"ID":"bcd7a932-6db9-4cca-b619-852242324725","Type":"ContainerStarted","Data":"365a012e0dc256fca522addd41c83a63fb808621ec56987dcd5bb062ddda6006"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.057388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" event={"ID":"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7","Type":"ContainerStarted","Data":"fd4bcb3bdf010859761ec964c7da87c34b995e58983fa9a8f2551a87d49615ca"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.059272 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" event={"ID":"2d8dfb13-f0a0-465c-821d-95f0df0a98cf","Type":"ContainerStarted","Data":"f4985d6fa20c1f7c84743bb91673b6608bdb784531485a09c314e16cd026c074"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.059795 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq" event={"ID":"4ba2ceb2-34e1-487c-9b13-0a480d6cc521","Type":"ContainerStarted","Data":"9f4bdfb82d894279880a03632f547065caa26381e56b663179a66bc5f2693f47"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.060058 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhsxz\" (UniqueName: \"kubernetes.io/projected/d64a8f76-87cc-45eb-bc92-5802a3db6c3d-kube-api-access-jhsxz\") pod \"cluster-image-registry-operator-dc59b4c8b-v4px6\" (UID: \"d64a8f76-87cc-45eb-bc92-5802a3db6c3d\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.060509 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" event={"ID":"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8","Type":"ContainerStarted","Data":"f58e74380a8c1e3f0d559b0c6a44b9911f247b06dc418233b9c41d9a25e6f05e"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.061323 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j" event={"ID":"b9810521-7440-49d4-bf04-7dbe3324cc5b","Type":"ContainerStarted","Data":"281ca335e21d1ef7df4a991d6128059fdc2c0654efcf33689fe07a1e7199b3d7"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.062270 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4m7jl" event={"ID":"d723501b-bb29-4d60-ad97-239eb749771f","Type":"ContainerStarted","Data":"55f3f39b8468ee5b14f5e58ff28b8af28a163986936638ae36501a8b019fc5b0"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.063276 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" event={"ID":"e14d9fb0-f377-4331-8bc1-8f4017bb95a3","Type":"ContainerStarted","Data":"9041d69395149888edcf83d772a3e7d07e853e34f5584fc9f9c54da93668e0ec"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.063299 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" event={"ID":"e14d9fb0-f377-4331-8bc1-8f4017bb95a3","Type":"ContainerStarted","Data":"b04fce866f961463553a14856d3791616fdd2474de989b38b991dcb6c8a6211a"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.063760 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-db5pg" 
event={"ID":"6a74f65d-f8d2-41af-8469-6f8d020b41de","Type":"ContainerStarted","Data":"ad3a2c6a62881881dc1b8603705ab22d4020343bea6e9b8d294520bb64a2c111"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.064247 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" event={"ID":"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6","Type":"ContainerStarted","Data":"ab97991b8db78d8e752d40d1152ee78f2b57fd461dd5402070894dd0d8f788b9"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.064995 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" event={"ID":"89314349-bbc8-4886-b93b-51358e4e71b0","Type":"ContainerStarted","Data":"967470003d103abe0f15cb3aee95279d6db37859b3df7fae4ba6e52e803a97ed"} Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.069960 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.070154 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.070690 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.570671905 +0000 UTC m=+143.945384780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.075429 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qdsgb" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.076525 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5c65\" (UniqueName: \"kubernetes.io/projected/b17c8e83-275b-4777-946a-c7360ad8fa48-kube-api-access-r5c65\") pod \"migrator-59844c95c7-b44gm\" (UID: \"b17c8e83-275b-4777-946a-c7360ad8fa48\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.109708 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4lkn\" (UniqueName: \"kubernetes.io/projected/7014a649-2d58-4772-9eb3-697e4b925923-kube-api-access-z4lkn\") pod \"package-server-manager-789f6589d5-kvfvk\" (UID: \"7014a649-2d58-4772-9eb3-697e4b925923\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.172082 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: 
E0131 14:43:59.172367 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.67235567 +0000 UTC m=+144.047068555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.259480 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.265947 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.273140 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.273214 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.273504 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.773489841 +0000 UTC m=+144.148202726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.293043 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ghblb"] Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.325942 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m"] Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.335676 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.375026 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.375394 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.875376821 +0000 UTC m=+144.250089706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.475789 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.476048 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.976002299 +0000 UTC m=+144.350715184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.476318 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.476662 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:43:59.976642785 +0000 UTC m=+144.351355760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.531024 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h262z"] Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.569283 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"] Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.577022 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.577183 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.07715561 +0000 UTC m=+144.451868505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.577611 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.577918 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.077906199 +0000 UTC m=+144.452619084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: W0131 14:43:59.676924 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca236cfc_51d0_4d79_b90c_ddac400b4dbb.slice/crio-e8aeb7923ac83033c9a62d8ea8d64c7240a8bb3aff16808268712db72259da0c WatchSource:0}: Error finding container e8aeb7923ac83033c9a62d8ea8d64c7240a8bb3aff16808268712db72259da0c: Status 404 returned error can't find the container with id e8aeb7923ac83033c9a62d8ea8d64c7240a8bb3aff16808268712db72259da0c Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.679456 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.679625 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.179596255 +0000 UTC m=+144.554309140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.679759 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.680112 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.180095768 +0000 UTC m=+144.554808663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.781492 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.781846 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.281833184 +0000 UTC m=+144.656546069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.818069 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv"] Jan 31 14:43:59 crc kubenswrapper[4751]: W0131 14:43:59.832940 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5caeb3dc_2a42_41b5_ac91_c1c8a216fb43.slice/crio-e4a212221c51a253a85cdbe4957b312fae8746c13753f6177155b8edc4477e15 WatchSource:0}: Error finding container e4a212221c51a253a85cdbe4957b312fae8746c13753f6177155b8edc4477e15: Status 404 returned error can't find the container with id e4a212221c51a253a85cdbe4957b312fae8746c13753f6177155b8edc4477e15 Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.867956 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk"] Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.882993 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.883419 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.383399646 +0000 UTC m=+144.758112631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: W0131 14:43:59.991301 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ff1674_4e01_4cdc_aea3_1e91a6a389e3.slice/crio-840836a6c72b8c7d0d3660d9fcc654730f428224f8ceaf3e509f7e3cfce1d3ce WatchSource:0}: Error finding container 840836a6c72b8c7d0d3660d9fcc654730f428224f8ceaf3e509f7e3cfce1d3ce: Status 404 returned error can't find the container with id 840836a6c72b8c7d0d3660d9fcc654730f428224f8ceaf3e509f7e3cfce1d3ce Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.993343 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.993553 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 14:44:00.493529584 +0000 UTC m=+144.868242469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:43:59 crc kubenswrapper[4751]: I0131 14:43:59.993594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:43:59 crc kubenswrapper[4751]: E0131 14:43:59.994018 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.494012207 +0000 UTC m=+144.868725092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.073355 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" event={"ID":"0058e7f4-92db-444d-a979-2880c3f83442","Type":"ContainerStarted","Data":"de9ae213b49446be120f1f31c36cc3cd78dec93edd36c195b61cb811975150e2"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.079406 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" event={"ID":"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8","Type":"ContainerStarted","Data":"96a0531e47323a9257c24b651a7067cc71a6c2a1c9189022bfa8c72e23c446c1"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.082177 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.095008 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.095686 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.595664681 +0000 UTC m=+144.970377576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.107570 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" event={"ID":"b1b479ec-e8d7-4fb6-8d0b-9fac28697df7","Type":"ContainerStarted","Data":"b65a4ce95281de25d0b50e1636e0b5f707265472aa0f98791c8f5c8c9aacc9dd"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.164744 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5hn9b" event={"ID":"01ff1674-4e01-4cdc-aea3-1e91a6a389e3","Type":"ContainerStarted","Data":"840836a6c72b8c7d0d3660d9fcc654730f428224f8ceaf3e509f7e3cfce1d3ce"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.177467 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.194058 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" event={"ID":"aceeef0f-cb36-43d6-8e09-35949fe73911","Type":"ContainerStarted","Data":"58e4e3cb7775c4ff4af54ae9e1ddd2a0c18e2492ba19b4029ef965cebf70ade9"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.198780 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.200150 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.700060348 +0000 UTC m=+145.074773233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.202500 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" event={"ID":"17d7ae01-24ad-448d-ae7c-10df353833f4","Type":"ContainerStarted","Data":"11a9ef810e22dda615de31154140864da1e74955db441a99400f8429aea57e2b"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.203620 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4m7jl" event={"ID":"d723501b-bb29-4d60-ad97-239eb749771f","Type":"ContainerStarted","Data":"c30c3c72b70b50a48a10df46b410ef194d9cd97429bf90377450d5f25486d136"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.203883 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4m7jl" Jan 31 
14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.209545 4751 generic.go:334] "Generic (PLEG): container finished" podID="e14d9fb0-f377-4331-8bc1-8f4017bb95a3" containerID="9041d69395149888edcf83d772a3e7d07e853e34f5584fc9f9c54da93668e0ec" exitCode=0 Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.209646 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" event={"ID":"e14d9fb0-f377-4331-8bc1-8f4017bb95a3","Type":"ContainerDied","Data":"9041d69395149888edcf83d772a3e7d07e853e34f5584fc9f9c54da93668e0ec"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.211244 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" event={"ID":"2d8dfb13-f0a0-465c-821d-95f0df0a98cf","Type":"ContainerStarted","Data":"a0c496cc6bc86826d7fe98acbe97e89bc0f321821d928b23952b9f122be93fc1"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.212695 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-4m7jl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.212712 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.212731 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4m7jl" podUID="d723501b-bb29-4d60-ad97-239eb749771f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.214420 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h262z" 
event={"ID":"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43","Type":"ContainerStarted","Data":"e4a212221c51a253a85cdbe4957b312fae8746c13753f6177155b8edc4477e15"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.216216 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" event={"ID":"f47a4e08-e21f-4a13-9ea2-bc1545a64cae","Type":"ContainerStarted","Data":"e428f36863e8b2a4199be5e87a3bbb26686432fcbc853e16723a120cc64b0c3b"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.217723 4751 generic.go:334] "Generic (PLEG): container finished" podID="5f75ab4e-45c1-4ed9-b966-afa91dbc88a6" containerID="c94a9d68d990b53dfe15b33171acb76b1530c8276359ca051256f382f8210faf" exitCode=0 Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.217776 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" event={"ID":"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6","Type":"ContainerDied","Data":"c94a9d68d990b53dfe15b33171acb76b1530c8276359ca051256f382f8210faf"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.225114 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" event={"ID":"ac1cf81b-8ec8-4ae4-bfb3-d46bb75f24d4","Type":"ContainerStarted","Data":"1a9425425136bc0fcd1b4e7782f30ce168735073d7c66c4111fc0e6d33b533c2"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.229574 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" event={"ID":"d031fa1b-4d52-47d7-8c39-5fa21fb6c244","Type":"ContainerStarted","Data":"926223eb332d6bd0fd3e2c89d69a100612602a488e2aab796efe3cf85cf01deb"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.231051 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" event={"ID":"84e2930a-5ae3-4171-a3dd-e5eea62ef157","Type":"ContainerStarted","Data":"15e734ffd4fba2493be6a9b1bfbac50c0f6bd9a8e2ffdca45f856621c3703f44"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.241379 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" event={"ID":"ca236cfc-51d0-4d79-b90c-ddac400b4dbb","Type":"ContainerStarted","Data":"e8aeb7923ac83033c9a62d8ea8d64c7240a8bb3aff16808268712db72259da0c"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.243794 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" event={"ID":"466718f1-f118-4f13-a983-14060aef09e6","Type":"ContainerStarted","Data":"aced264e1a5d786ba81e1dfd63c009187b1be53d0ee6ad4e1170392825ce2f8a"} Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.300212 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.301148 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.801132577 +0000 UTC m=+145.175845462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.402241 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.403105 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:00.903089898 +0000 UTC m=+145.277802783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.426863 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.451625 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.504000 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.504353 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.004339322 +0000 UTC m=+145.379052207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.606138 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.606472 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.106460339 +0000 UTC m=+145.481173224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.700838 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5r6kv"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.707458 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.707830 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.207816215 +0000 UTC m=+145.582529100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.718895 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-skzbg"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.720954 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vbfvz"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.733885 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.739841 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.748750 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4m7jl" podStartSLOduration=120.748735795 podStartE2EDuration="2m0.748735795s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:00.74623837 +0000 UTC m=+145.120951255" watchObservedRunningTime="2026-01-31 14:44:00.748735795 +0000 UTC m=+145.123448680" Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.784553 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 
14:44:00.793844 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hgs4c" podStartSLOduration=119.793825696 podStartE2EDuration="1m59.793825696s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:00.788445974 +0000 UTC m=+145.163158849" watchObservedRunningTime="2026-01-31 14:44:00.793825696 +0000 UTC m=+145.168538581" Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.808725 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.809038 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.309026738 +0000 UTC m=+145.683739623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.865691 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s7gwp" podStartSLOduration=119.865673363 podStartE2EDuration="1m59.865673363s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:00.840059577 +0000 UTC m=+145.214772462" watchObservedRunningTime="2026-01-31 14:44:00.865673363 +0000 UTC m=+145.240386248" Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.867251 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.867671 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qdsgb"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.871229 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6"] Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.871262 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x4rnh"] Jan 31 14:44:00 crc kubenswrapper[4751]: W0131 14:44:00.874968 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a244ab_c405_48aa_893f_f50995384ede.slice/crio-cf6795000c69e913a437a37284db30d46d068223ad9c3aaaf739a528bd1f8eab WatchSource:0}: Error finding container cf6795000c69e913a437a37284db30d46d068223ad9c3aaaf739a528bd1f8eab: Status 404 returned error can't find the container with id cf6795000c69e913a437a37284db30d46d068223ad9c3aaaf739a528bd1f8eab Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.910759 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:00 crc kubenswrapper[4751]: E0131 14:44:00.911325 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.411310768 +0000 UTC m=+145.786023653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.935939 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"] Jan 31 14:44:00 crc kubenswrapper[4751]: W0131 14:44:00.985887 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb17c8e83_275b_4777_946a_c7360ad8fa48.slice/crio-af4f3abc4a3b62a128d16eb2c611b54e459447c23307822acee137a849594c11 WatchSource:0}: Error finding container af4f3abc4a3b62a128d16eb2c611b54e459447c23307822acee137a849594c11: Status 404 returned error can't find the container with id af4f3abc4a3b62a128d16eb2c611b54e459447c23307822acee137a849594c11 Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.996902 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podStartSLOduration=119.996882888 podStartE2EDuration="1m59.996882888s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:00.959339907 +0000 UTC m=+145.334052792" watchObservedRunningTime="2026-01-31 14:44:00.996882888 +0000 UTC m=+145.371595773" Jan 31 14:44:00 crc kubenswrapper[4751]: I0131 14:44:00.997303 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pmglg" podStartSLOduration=120.997297539 podStartE2EDuration="2m0.997297539s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:00.996104067 +0000 UTC m=+145.370816942" watchObservedRunningTime="2026-01-31 14:44:00.997297539 +0000 UTC m=+145.372010414" Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.012720 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.013058 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.513047385 +0000 UTC m=+145.887760270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: W0131 14:44:01.050290 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd64a8f76_87cc_45eb_bc92_5802a3db6c3d.slice/crio-9394bf1aac35689cd86cf663972ec5310ddc561b1fc7c22e8b46fa9b64fffc79 WatchSource:0}: Error finding container 9394bf1aac35689cd86cf663972ec5310ddc561b1fc7c22e8b46fa9b64fffc79: Status 404 returned error can't find the container with id 9394bf1aac35689cd86cf663972ec5310ddc561b1fc7c22e8b46fa9b64fffc79 Jan 31 14:44:01 crc kubenswrapper[4751]: W0131 14:44:01.080469 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7014a649_2d58_4772_9eb3_697e4b925923.slice/crio-76846dec47e5b4970ff32470e578446f70d52dac43c1c3d5677e971c56d13a0f WatchSource:0}: Error finding container 76846dec47e5b4970ff32470e578446f70d52dac43c1c3d5677e971c56d13a0f: Status 404 returned error can't find the container with id 76846dec47e5b4970ff32470e578446f70d52dac43c1c3d5677e971c56d13a0f Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.124064 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.124227 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.62420718 +0000 UTC m=+145.998920065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.124384 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.124827 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.624818906 +0000 UTC m=+145.999531791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.224953 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.225311 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.72529562 +0000 UTC m=+146.100008505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.225608 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.225861 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.725853164 +0000 UTC m=+146.100566049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.250983 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" event={"ID":"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4","Type":"ContainerStarted","Data":"3bf0bb5c75e0d3e46cdfd98f66f2db57aa365840a1943b8b5415bf165d392066"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.251731 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qdsgb" event={"ID":"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a","Type":"ContainerStarted","Data":"4574660b2ec0b8ca54c33feb5110d2f650949aa790d87c7c7eb817bea8cc9a00"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.255008 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-db5pg" event={"ID":"6a74f65d-f8d2-41af-8469-6f8d020b41de","Type":"ContainerStarted","Data":"59249b5db3559d44ce1d49fd55799f7f3bcc4d0ede7839ca802f94b7dd5b3b94"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.256268 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" event={"ID":"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea","Type":"ContainerStarted","Data":"8d8b4a1528af48d18db181db8a7bebc79bb86f32aba8601a554e74b7bcaef05b"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.270981 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-db5pg" 
podStartSLOduration=121.270962386 podStartE2EDuration="2m1.270962386s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.270722209 +0000 UTC m=+145.645435094" watchObservedRunningTime="2026-01-31 14:44:01.270962386 +0000 UTC m=+145.645675271" Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.271109 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" event={"ID":"e14d9fb0-f377-4331-8bc1-8f4017bb95a3","Type":"ContainerStarted","Data":"20879949a09c49e31dedd607f1479e96ea74daf3c078632d9eeaf5e4b3b68d85"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.284151 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" event={"ID":"9edad05e-bd87-4a20-a947-6b09f9f7c93a","Type":"ContainerStarted","Data":"c82f7bca9875a2ef35eb3f4f5fbad8acfc64f20ad382ddc3364970333da29ca0"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.290528 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" event={"ID":"17d7ae01-24ad-448d-ae7c-10df353833f4","Type":"ContainerStarted","Data":"52542f554222fd78c4824c2265e7cc50a2c69472e6e2676a61d7a7affa119cfc"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.297361 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" event={"ID":"d031fa1b-4d52-47d7-8c39-5fa21fb6c244","Type":"ContainerStarted","Data":"795803bb88a0e7f1bbf4e9896ee665ce49d04a6cdb547135406940042ea76f72"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.305613 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" 
event={"ID":"f13811e7-14eb-4a17-90a1-345619f9fb29","Type":"ContainerStarted","Data":"903e9ce05a25a0620d64da0a5d11a7b3416512e6fbe058d2e17fdc2b278a385a"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.307647 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cp47m" podStartSLOduration=120.307614453 podStartE2EDuration="2m0.307614453s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.303495085 +0000 UTC m=+145.678207970" watchObservedRunningTime="2026-01-31 14:44:01.307614453 +0000 UTC m=+145.682327338" Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.354616 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.355115 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.855023005 +0000 UTC m=+146.229735890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.356256 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.357016 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.857003067 +0000 UTC m=+146.231715952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.359858 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czqdr" podStartSLOduration=120.359841912 podStartE2EDuration="2m0.359841912s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.357764428 +0000 UTC m=+145.732477313" watchObservedRunningTime="2026-01-31 14:44:01.359841912 +0000 UTC m=+145.734554797" Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.364576 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x4njd" event={"ID":"dc13f997-316e-4e81-a56e-0fa6e02d1502","Type":"ContainerStarted","Data":"dce6077c142973950a60f9709e36f8427a2008077ace6934308f24b16df27181"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.364611 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x4njd" event={"ID":"dc13f997-316e-4e81-a56e-0fa6e02d1502","Type":"ContainerStarted","Data":"352db6cf586acb663510b8303b28633793e0196c8b93440629b7075d97e63518"} Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.369778 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j" 
event={"ID":"b9810521-7440-49d4-bf04-7dbe3324cc5b","Type":"ContainerStarted","Data":"e2c726fff05f34e326d0ee1987f7f7c60d6d26af5f8fe6fa7c97fa815a3317d7"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.380430 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x4njd" podStartSLOduration=6.380415966 podStartE2EDuration="6.380415966s" podCreationTimestamp="2026-01-31 14:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.377139179 +0000 UTC m=+145.751852064" watchObservedRunningTime="2026-01-31 14:44:01.380415966 +0000 UTC m=+145.755128851"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.401448 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h262z" event={"ID":"5caeb3dc-2a42-41b5-ac91-c1c8a216fb43","Type":"ContainerStarted","Data":"4ca9486c0c475df9706c69a8042a91ffb75ee83dea5dea8944cb0a263025ee01"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.429904 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" event={"ID":"aceeef0f-cb36-43d6-8e09-35949fe73911","Type":"ContainerStarted","Data":"4baa46c448affe4dd56663a4e343cbc7c6fcd34dd03a41b5df88bb7dcb0741d8"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.437030 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-h262z" podStartSLOduration=121.43700477 podStartE2EDuration="2m1.43700477s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.436710622 +0000 UTC m=+145.811423507" watchObservedRunningTime="2026-01-31 14:44:01.43700477 +0000 UTC m=+145.811717655"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.447763 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" event={"ID":"4c4b193a-a01b-440a-a94a-55c4b5f06586","Type":"ContainerStarted","Data":"08256894410bac7c47817ab162acd2e447ad5d18b741d1c9eb8858f122bb60c1"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.460490 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w4lzx" podStartSLOduration=120.4604697 podStartE2EDuration="2m0.4604697s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.453689961 +0000 UTC m=+145.828402846" watchObservedRunningTime="2026-01-31 14:44:01.4604697 +0000 UTC m=+145.835182585"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.461304 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.461937 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:01.961904878 +0000 UTC m=+146.336617763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.464989 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" event={"ID":"466718f1-f118-4f13-a983-14060aef09e6","Type":"ContainerStarted","Data":"b09e25f06983841ed3cf3749bd8fc6428edcabe64175c718c26a79d1849563e7"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.469876 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" event={"ID":"bcd7a932-6db9-4cca-b619-852242324725","Type":"ContainerStarted","Data":"d2d4f70fe3b8dde349f58413464d1d27ca8ea76a8246d3c31d2f88d50401e6c2"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.475349 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" event={"ID":"ca236cfc-51d0-4d79-b90c-ddac400b4dbb","Type":"ContainerStarted","Data":"f69800f221f673da0026a0fae7aed9cdbc49c7f5de5433b68fb8c0b3dcc1115b"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.477856 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-drp8h" podStartSLOduration=120.477841848 podStartE2EDuration="2m0.477841848s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.477402117 +0000 UTC m=+145.852115002" watchObservedRunningTime="2026-01-31 14:44:01.477841848 +0000 UTC m=+145.852554733"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.478053 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" event={"ID":"7014a649-2d58-4772-9eb3-697e4b925923","Type":"ContainerStarted","Data":"76846dec47e5b4970ff32470e578446f70d52dac43c1c3d5677e971c56d13a0f"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.497222 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" event={"ID":"802d5225-ef3f-485c-bb85-3c0f18e42952","Type":"ContainerStarted","Data":"01af9b04a121e47de6d720ef96908370b377b2bf6ed16ab772bd8cea30c24502"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.498061 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.502430 4751 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xr2gt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body=
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.502468 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.508842 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" event={"ID":"d64a8f76-87cc-45eb-bc92-5802a3db6c3d","Type":"ContainerStarted","Data":"9394bf1aac35689cd86cf663972ec5310ddc561b1fc7c22e8b46fa9b64fffc79"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.512337 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" event={"ID":"89a244ab-c405-48aa-893f-f50995384ede","Type":"ContainerStarted","Data":"cf6795000c69e913a437a37284db30d46d068223ad9c3aaaf739a528bd1f8eab"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.519322 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ghblb" podStartSLOduration=120.519303693 podStartE2EDuration="2m0.519303693s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.490506503 +0000 UTC m=+145.865219388" watchObservedRunningTime="2026-01-31 14:44:01.519303693 +0000 UTC m=+145.894016578"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.521918 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" event={"ID":"b17c8e83-275b-4777-946a-c7360ad8fa48","Type":"ContainerStarted","Data":"af4f3abc4a3b62a128d16eb2c611b54e459447c23307822acee137a849594c11"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.526957 4751 generic.go:334] "Generic (PLEG): container finished" podID="89314349-bbc8-4886-b93b-51358e4e71b0" containerID="5eb306a96af5104746662256225468d9f5c6caad943f48faa1ea6569b8191d66" exitCode=0
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.527057 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" event={"ID":"89314349-bbc8-4886-b93b-51358e4e71b0","Type":"ContainerDied","Data":"5eb306a96af5104746662256225468d9f5c6caad943f48faa1ea6569b8191d66"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.533710 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" event={"ID":"2ad3db81-4cb9-49a5-b4e0-55b546996fa0","Type":"ContainerStarted","Data":"f3b82becccb0b3b0a7ccc90a858d15265859ba0425fe811f0ea63bd98090d636"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.533776 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" event={"ID":"2ad3db81-4cb9-49a5-b4e0-55b546996fa0","Type":"ContainerStarted","Data":"48f9659ec28b8818f31ecc6ee3c28403c7bcd4830214a7011852e9664deee8e5"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.535956 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq" event={"ID":"4ba2ceb2-34e1-487c-9b13-0a480d6cc521","Type":"ContainerStarted","Data":"6899da3ff15b265325d63b73a135e1e7647ed2f68a975cd0a5b3abfc7c692122"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.542550 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" event={"ID":"0058e7f4-92db-444d-a979-2880c3f83442","Type":"ContainerStarted","Data":"32af7db5826454a8966a23c401bb0e748ca145604ab881bdd690506b73645f81"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.546805 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" event={"ID":"853ca050-beae-4089-a5df-9556eeda508b","Type":"ContainerStarted","Data":"f8a8687cb73c8ea914386022ed953012a318696a73bafd13ca6351ecf6baedb0"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.560267 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4gqrl" podStartSLOduration=120.560249074 podStartE2EDuration="2m0.560249074s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.528948748 +0000 UTC m=+145.903661633" watchObservedRunningTime="2026-01-31 14:44:01.560249074 +0000 UTC m=+145.934961959"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.561630 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" podStartSLOduration=121.56160889 podStartE2EDuration="2m1.56160889s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.556623049 +0000 UTC m=+145.931335934" watchObservedRunningTime="2026-01-31 14:44:01.56160889 +0000 UTC m=+145.936321765"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.563593 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.566895 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.06687985 +0000 UTC m=+146.441592745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.585388 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" event={"ID":"9f99779e-5e77-4b5c-8886-7accebe8a897","Type":"ContainerStarted","Data":"4040eb76ab95246c66d1300eb50d9756c93fadedb6446beab730cd9f803ccec1"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.607290 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xqgfv" podStartSLOduration=120.607271216 podStartE2EDuration="2m0.607271216s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.607188214 +0000 UTC m=+145.981901089" watchObservedRunningTime="2026-01-31 14:44:01.607271216 +0000 UTC m=+145.981984101"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.621631 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" event={"ID":"84e2930a-5ae3-4171-a3dd-e5eea62ef157","Type":"ContainerStarted","Data":"8e402889398f0b5d93bacd46f42378e3cdc7f2ee478995578d04804d8ec0f029"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.622923 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.624823 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" event={"ID":"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1","Type":"ContainerStarted","Data":"7abf360dd9c0e2e95e4396aa0bbb3d62d9791b073cdcaa6e42b4b4a4c5cef71e"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.630367 4751 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7762w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.630442 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.648518 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" podStartSLOduration=120.648500335 podStartE2EDuration="2m0.648500335s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.647530229 +0000 UTC m=+146.022243134" watchObservedRunningTime="2026-01-31 14:44:01.648500335 +0000 UTC m=+146.023213220"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.652276 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-skzbg" event={"ID":"f6be9bbf-6799-45e0-8d53-790a5484f3a4","Type":"ContainerStarted","Data":"0c2cf46331a59d6d049bbbbf297f3bf3ef89c230accfb3d1d65334c10776daf5"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.668590 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.669710 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" event={"ID":"eade01dc-846b-42a8-a6ed-8cf0a0663e82","Type":"ContainerStarted","Data":"9d44648df839910022878a08450dec667db28fe365908b86584da87c8884b401"}
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.670164 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.170133046 +0000 UTC m=+146.544846011 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.683973 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" event={"ID":"5c630253-f658-44fb-891d-f560f1e2b577","Type":"ContainerStarted","Data":"37dea14bb855fc0c21bc1fd1a3441e623824acc1c0c1865c768e36090fc733e6"}
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.684063 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.684160 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-4m7jl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.684191 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4m7jl" podUID="d723501b-bb29-4d60-ad97-239eb749771f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.690537 4751 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sxjf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.690578 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.700176 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" podStartSLOduration=120.700161539 podStartE2EDuration="2m0.700161539s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:01.700141078 +0000 UTC m=+146.074853963" watchObservedRunningTime="2026-01-31 14:44:01.700161539 +0000 UTC m=+146.074874424"
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.769786 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.770449 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.270433345 +0000 UTC m=+146.645146230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.870560 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.870695 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.370675792 +0000 UTC m=+146.745388677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.871087 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.873643 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.3736349 +0000 UTC m=+146.748347785 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.971638 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.971835 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.471808102 +0000 UTC m=+146.846520987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:01 crc kubenswrapper[4751]: I0131 14:44:01.971969 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:01 crc kubenswrapper[4751]: E0131 14:44:01.972333 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.472323046 +0000 UTC m=+146.847036011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.072705 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.073244 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.57322624 +0000 UTC m=+146.947939125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.182423 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.185362 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.685347321 +0000 UTC m=+147.060060206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.283791 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.284434 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.784412247 +0000 UTC m=+147.159125132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.388165 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.388497 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.888485635 +0000 UTC m=+147.263198520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.488972 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.489162 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.989136853 +0000 UTC m=+147.363849738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.489357 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.489604 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:02.989597325 +0000 UTC m=+147.364310210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.590149 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.590343 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.090317825 +0000 UTC m=+147.465030710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.590617 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.590908 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.09090021 +0000 UTC m=+147.465613095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.631567 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.689870 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" event={"ID":"9edad05e-bd87-4a20-a947-6b09f9f7c93a","Type":"ContainerStarted","Data":"6e56ec891c20edecfef021496eccccd07734ce2f1e1a464798c1434bf7865d77"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.690303 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.691160 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.691301 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.191277341 +0000 UTC m=+147.565990236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.691492 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.691903 4751 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vc9q2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.691950 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" podUID="9edad05e-bd87-4a20-a947-6b09f9f7c93a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.692446 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 14:44:03.192431071 +0000 UTC m=+147.567144066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.692558 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" event={"ID":"5f75ab4e-45c1-4ed9-b966-afa91dbc88a6","Type":"ContainerStarted","Data":"0f296d886a4e86c9fb7c0821d1aba553311a118eb2a4d1ed0c4ac96370b82aa8"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.694673 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" event={"ID":"d031fa1b-4d52-47d7-8c39-5fa21fb6c244","Type":"ContainerStarted","Data":"ecf1d59d03b76e02faf85fdf6e2372715e317b2d34e7cad119bde4411e19b28f"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.696314 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq" event={"ID":"4ba2ceb2-34e1-487c-9b13-0a480d6cc521","Type":"ContainerStarted","Data":"4329bd44507bea908031067a5de7d34eed01214163138e6bbd8c31f760de6fe5"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.698271 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5hn9b" event={"ID":"01ff1674-4e01-4cdc-aea3-1e91a6a389e3","Type":"ContainerStarted","Data":"0598de1264a51e543dcd743ae65f25862fb214b642229772b95f9286721b9b77"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.699659 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" event={"ID":"f47a4e08-e21f-4a13-9ea2-bc1545a64cae","Type":"ContainerStarted","Data":"1aad2b6d72ce6c851e49ad7275df69908b14d74b5b348519616ddd15626c6128"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.701946 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" event={"ID":"d64a8f76-87cc-45eb-bc92-5802a3db6c3d","Type":"ContainerStarted","Data":"8f80be1b4f04566c437fc0b68bc723eae3982f173d1303617673bfaa62f1f3dc"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.704559 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" event={"ID":"b17c8e83-275b-4777-946a-c7360ad8fa48","Type":"ContainerStarted","Data":"2e231a1fa2f3ae6e3e93b3c4f014cf4b18f0455ad1eb05c727fc37de2d1bd364"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.704605 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" event={"ID":"b17c8e83-275b-4777-946a-c7360ad8fa48","Type":"ContainerStarted","Data":"4f43fdfe7720646fe62898497506d1efbe37a0c991a885a83b74fbd4b8132c74"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.705046 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" podStartSLOduration=121.705035164 podStartE2EDuration="2m1.705035164s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.703552615 +0000 UTC m=+147.078265500" watchObservedRunningTime="2026-01-31 14:44:02.705035164 +0000 UTC m=+147.079748049" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.706565 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" event={"ID":"853ca050-beae-4089-a5df-9556eeda508b","Type":"ContainerStarted","Data":"a52fabbd60d21daf58e9d10f46243c241d1f2ed6b3424cd226da30bbcac9aefb"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.706593 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" event={"ID":"853ca050-beae-4089-a5df-9556eeda508b","Type":"ContainerStarted","Data":"ce1be442c938a3a303c31bde3f68208f03c465b14cc789c88b94bb5029c12adc"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.708196 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" event={"ID":"9f99779e-5e77-4b5c-8886-7accebe8a897","Type":"ContainerStarted","Data":"2886b4beec8fbe46944eef93fa0ce15d7bd64528ac55c6bd65531ed442dea8af"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.708433 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.709738 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" event={"ID":"89a244ab-c405-48aa-893f-f50995384ede","Type":"ContainerStarted","Data":"81a09aec382eab3d1121a3bdc5e760cd357ff1f2ae90a816828e7967447d1045"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.710943 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.711041 4751 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7hc86 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.711097 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" podUID="9f99779e-5e77-4b5c-8886-7accebe8a897" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.711934 4751 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7hjp9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.711978 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" podUID="89a244ab-c405-48aa-893f-f50995384ede" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.713972 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" event={"ID":"89314349-bbc8-4886-b93b-51358e4e71b0","Type":"ContainerStarted","Data":"466860ab81933ff0ead4c18a5a020887a870782a5543ba0f4d4b546fa85314fb"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.714031 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" event={"ID":"89314349-bbc8-4886-b93b-51358e4e71b0","Type":"ContainerStarted","Data":"83302bf2ea2d2fb8158b63ecb177ddf0ca75d838e791dc4168c4f281ff0d4cdf"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.717874 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" event={"ID":"7014a649-2d58-4772-9eb3-697e4b925923","Type":"ContainerStarted","Data":"5fa53bf5deac84b0ced844b48b88bff7503d29ce0ca7b87ae125cb33f9057c7c"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.717924 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" event={"ID":"7014a649-2d58-4772-9eb3-697e4b925923","Type":"ContainerStarted","Data":"6ba7af36aec1573d4d7f1b398d8fff0c8050ad6797e5d647b129bab4a92b19e3"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.718048 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.721946 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-skzbg" event={"ID":"f6be9bbf-6799-45e0-8d53-790a5484f3a4","Type":"ContainerStarted","Data":"1992e3beb7df5f31535ea048340397d93c80b4751327faf367960d45b1b123f4"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.721987 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-skzbg" event={"ID":"f6be9bbf-6799-45e0-8d53-790a5484f3a4","Type":"ContainerStarted","Data":"aff3281f7d4db094a665eb4271557a6c71c2f13307ba221302e7069d8acf2fab"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.722552 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-skzbg" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.724890 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" event={"ID":"0058e7f4-92db-444d-a979-2880c3f83442","Type":"ContainerStarted","Data":"2f491d8d63860ba98caece552ebaff2be1c8f903cbea1402c89ae7f9679ec278"} Jan 31 14:44:02 crc 
kubenswrapper[4751]: I0131 14:44:02.728681 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" event={"ID":"4c4b193a-a01b-440a-a94a-55c4b5f06586","Type":"ContainerStarted","Data":"99d061d2fb49bff96c704106d2ba56cc3727d17965227544725d3610de44bde3"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.730552 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" event={"ID":"cc8ec6f8-52f3-4bb8-a00b-4f73276a3af4","Type":"ContainerStarted","Data":"023362aaa08846f80ba54c770dd5238cd4255165bfb25a5345e3ce4931729bb2"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.732025 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" event={"ID":"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea","Type":"ContainerStarted","Data":"cc163d448fa8fad6b5ab0077c0960c4003a53c503f6d097090f206fed6245a22"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.732622 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.734055 4751 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5r6kv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.734114 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 
14:44:02.734444 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qdsgb" event={"ID":"e999b5a4-2e54-4195-98fa-4c5fa36f1b3a","Type":"ContainerStarted","Data":"8461527ee3273a8a1474a8ffc7a818ace12f0233df1abd1b2f654fcad45ffcc4"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.736296 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" event={"ID":"eade01dc-846b-42a8-a6ed-8cf0a0663e82","Type":"ContainerStarted","Data":"9d26a6d6092efc3cfe1b53bda2539e32fc75d0f27a288ecda4b2062254a0fc73"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.736543 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" podStartSLOduration=121.736527136 podStartE2EDuration="2m1.736527136s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.735259722 +0000 UTC m=+147.109972607" watchObservedRunningTime="2026-01-31 14:44:02.736527136 +0000 UTC m=+147.111240021" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.738567 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j" event={"ID":"b9810521-7440-49d4-bf04-7dbe3324cc5b","Type":"ContainerStarted","Data":"14415e9473becce338fe33d29144e768b2735a0df0f1ed6016935ce3baae1250"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.741677 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h4drr" event={"ID":"5c630253-f658-44fb-891d-f560f1e2b577","Type":"ContainerStarted","Data":"a83294cf387b6e17c78fb2bd16cec0b52e7bd1aa28deeafb800ae11bb2f42f2f"} Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.742510 4751 patch_prober.go:28] interesting 
pod/route-controller-manager-6576b87f9c-7762w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.742539 4751 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xr2gt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.742563 4751 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sxjf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.742575 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.742593 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.742555 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" 
podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.744552 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-db5pg" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.744581 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.744913 4751 patch_prober.go:28] interesting pod/console-operator-58897d9998-db5pg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.744960 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-db5pg" podUID="6a74f65d-f8d2-41af-8469-6f8d020b41de" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.792964 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.794650 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-xv2tk" 
podStartSLOduration=121.79463675 podStartE2EDuration="2m1.79463675s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.770628736 +0000 UTC m=+147.145341621" watchObservedRunningTime="2026-01-31 14:44:02.79463675 +0000 UTC m=+147.169349635" Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.795835 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.295810661 +0000 UTC m=+147.670523636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.797976 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.802389 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4px6" podStartSLOduration=121.802381524 podStartE2EDuration="2m1.802381524s" 
podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.795540014 +0000 UTC m=+147.170252899" watchObservedRunningTime="2026-01-31 14:44:02.802381524 +0000 UTC m=+147.177094399" Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.802588 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.302571069 +0000 UTC m=+147.677283954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.822037 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8fgxq" podStartSLOduration=121.822021763 podStartE2EDuration="2m1.822021763s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.820471562 +0000 UTC m=+147.195184457" watchObservedRunningTime="2026-01-31 14:44:02.822021763 +0000 UTC m=+147.196734648" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.865990 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.867776 4751 
patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.867821 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.868408 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l76jv" podStartSLOduration=121.868399678 podStartE2EDuration="2m1.868399678s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.867775701 +0000 UTC m=+147.242488586" watchObservedRunningTime="2026-01-31 14:44:02.868399678 +0000 UTC m=+147.243112553" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.869920 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5hn9b" podStartSLOduration=121.869915198 podStartE2EDuration="2m1.869915198s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.849108628 +0000 UTC m=+147.223821503" watchObservedRunningTime="2026-01-31 14:44:02.869915198 +0000 UTC m=+147.244628083" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.882460 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" podStartSLOduration=121.882438218 podStartE2EDuration="2m1.882438218s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.881634027 +0000 UTC m=+147.256346912" watchObservedRunningTime="2026-01-31 14:44:02.882438218 +0000 UTC m=+147.257151103" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.898221 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vbfvz" podStartSLOduration=121.898203275 podStartE2EDuration="2m1.898203275s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.898138643 +0000 UTC m=+147.272851528" watchObservedRunningTime="2026-01-31 14:44:02.898203275 +0000 UTC m=+147.272916160" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.911743 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:02 crc kubenswrapper[4751]: E0131 14:44:02.912211 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.412183594 +0000 UTC m=+147.786896539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.917962 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8m7f4" podStartSLOduration=122.917941386 podStartE2EDuration="2m2.917941386s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.915365818 +0000 UTC m=+147.290078703" watchObservedRunningTime="2026-01-31 14:44:02.917941386 +0000 UTC m=+147.292654271" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.961006 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz99n" podStartSLOduration=122.960988293 podStartE2EDuration="2m2.960988293s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.960062918 +0000 UTC m=+147.334775803" watchObservedRunningTime="2026-01-31 14:44:02.960988293 +0000 UTC m=+147.335701178" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.961876 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cc6m2" podStartSLOduration=122.961871286 podStartE2EDuration="2m2.961871286s" podCreationTimestamp="2026-01-31 14:42:00 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.941017885 +0000 UTC m=+147.315730770" watchObservedRunningTime="2026-01-31 14:44:02.961871286 +0000 UTC m=+147.336584161" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.983306 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" podStartSLOduration=122.983288802 podStartE2EDuration="2m2.983288802s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.983011134 +0000 UTC m=+147.357724019" watchObservedRunningTime="2026-01-31 14:44:02.983288802 +0000 UTC m=+147.358001687" Jan 31 14:44:02 crc kubenswrapper[4751]: I0131 14:44:02.999101 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-v8p8j" podStartSLOduration=121.999087119 podStartE2EDuration="2m1.999087119s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:02.997883557 +0000 UTC m=+147.372596442" watchObservedRunningTime="2026-01-31 14:44:02.999087119 +0000 UTC m=+147.373800004" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.013013 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.013310 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.513299774 +0000 UTC m=+147.888012659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.029181 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.029499 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.030823 4751 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-wdsj4 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.030856 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" podUID="5f75ab4e-45c1-4ed9-b966-afa91dbc88a6" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.042055 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-skzbg" podStartSLOduration=8.042032023 podStartE2EDuration="8.042032023s" podCreationTimestamp="2026-01-31 14:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.039811484 +0000 UTC m=+147.414524369" watchObservedRunningTime="2026-01-31 14:44:03.042032023 +0000 UTC m=+147.416744908" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.042326 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" podStartSLOduration=122.04232149 podStartE2EDuration="2m2.04232149s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.022560029 +0000 UTC m=+147.397272914" watchObservedRunningTime="2026-01-31 14:44:03.04232149 +0000 UTC m=+147.417034365" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.059311 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" podStartSLOduration=122.059295649 podStartE2EDuration="2m2.059295649s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.058660162 +0000 UTC m=+147.433373047" watchObservedRunningTime="2026-01-31 14:44:03.059295649 +0000 UTC m=+147.434008534" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.089099 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" podStartSLOduration=123.089085035 podStartE2EDuration="2m3.089085035s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.088183721 +0000 UTC m=+147.462896606" watchObservedRunningTime="2026-01-31 14:44:03.089085035 +0000 UTC m=+147.463797920" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.114177 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.114488 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.614474786 +0000 UTC m=+147.989187671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.128462 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk" podStartSLOduration=122.128448665 podStartE2EDuration="2m2.128448665s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.108135408 +0000 UTC m=+147.482848293" watchObservedRunningTime="2026-01-31 14:44:03.128448665 +0000 UTC m=+147.503161540" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.128612 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" podStartSLOduration=123.128609939 podStartE2EDuration="2m3.128609939s" podCreationTimestamp="2026-01-31 14:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.126680728 +0000 UTC m=+147.501393613" watchObservedRunningTime="2026-01-31 14:44:03.128609939 +0000 UTC m=+147.503322824" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.153803 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qdsgb" podStartSLOduration=7.153791354 podStartE2EDuration="7.153791354s" podCreationTimestamp="2026-01-31 14:43:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.13962869 +0000 UTC m=+147.514341575" watchObservedRunningTime="2026-01-31 14:44:03.153791354 +0000 UTC m=+147.528504239" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.215653 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.216037 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.716019967 +0000 UTC m=+148.090732852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.317204 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.317385 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.817359943 +0000 UTC m=+148.192072828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.317488 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.317801 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.817793795 +0000 UTC m=+148.192506680 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.418542 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.418732 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.918709359 +0000 UTC m=+148.293422244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.418875 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.419191 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:03.919184962 +0000 UTC m=+148.293897847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.519805 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.519982 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.019959053 +0000 UTC m=+148.394671938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.520269 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.520694 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.020677652 +0000 UTC m=+148.395390537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.535586 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.535643 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.537385 4751 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5f7jc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.537437 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" podUID="89314349-bbc8-4886-b93b-51358e4e71b0" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.621104 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 
14:44:03.621283 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.121258778 +0000 UTC m=+148.495971663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.621519 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.621805 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.121796672 +0000 UTC m=+148.496509557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.723007 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.723200 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.223176489 +0000 UTC m=+148.597889374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.723277 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.723568 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.223561499 +0000 UTC m=+148.598274384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.746690 4751 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7hc86 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.746736 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" podUID="9f99779e-5e77-4b5c-8886-7accebe8a897" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.746774 4751 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5r6kv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.746817 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 31 14:44:03 crc 
kubenswrapper[4751]: I0131 14:44:03.747087 4751 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vc9q2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.747135 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" podUID="9edad05e-bd87-4a20-a947-6b09f9f7c93a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.748169 4751 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7hjp9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.748196 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" podUID="89a244ab-c405-48aa-893f-f50995384ede" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.752981 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.824500 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.824703 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.324667119 +0000 UTC m=+148.699380004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.826146 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.828176 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.328159941 +0000 UTC m=+148.702872826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.835529 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b44gm" podStartSLOduration=122.835512245 podStartE2EDuration="2m2.835512245s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:03.162562435 +0000 UTC m=+147.537275320" watchObservedRunningTime="2026-01-31 14:44:03.835512245 +0000 UTC m=+148.210225120"
Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.874734 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:03 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:03 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:03 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.875114 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.931215 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.931384 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.431360106 +0000 UTC m=+148.806072991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:03 crc kubenswrapper[4751]: I0131 14:44:03.931540 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:03 crc kubenswrapper[4751]: E0131 14:44:03.931839 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.431828268 +0000 UTC m=+148.806541153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.003186 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt"
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.032298 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.032463 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.532441435 +0000 UTC m=+148.907154320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.032798 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.033207 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.533189354 +0000 UTC m=+148.907902239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.075373 4751 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-z9dj7 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.075614 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" podUID="e14d9fb0-f377-4331-8bc1-8f4017bb95a3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.075388 4751 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-z9dj7 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.075791 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" podUID="e14d9fb0-f377-4331-8bc1-8f4017bb95a3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.134193 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.134404 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.634381246 +0000 UTC m=+149.009094131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.134487 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.134813 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.634801458 +0000 UTC m=+149.009514343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.235603 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.235833 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.735802645 +0000 UTC m=+149.110515620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.236478 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.236826 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.736817161 +0000 UTC m=+149.111530046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.338284 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.338431 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.838409354 +0000 UTC m=+149.213122229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.338833 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.339160 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.839153034 +0000 UTC m=+149.213865909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.349266 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-db5pg"
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.439871 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.440200 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:04.940185242 +0000 UTC m=+149.314898127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.540973 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.541419 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.041375124 +0000 UTC m=+149.416088009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.642009 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.642217 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.142192226 +0000 UTC m=+149.516905111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.642343 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.642693 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.142686619 +0000 UTC m=+149.517399504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.743601 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.744055 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.244029955 +0000 UTC m=+149.618742840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.751025 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" event={"ID":"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1","Type":"ContainerStarted","Data":"2ab0f0faa26018e025fa907c7d5674f1870c94097e91e92f032c5b5d999c96de"}
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.751991 4751 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5r6kv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.752107 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.845473 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.848762 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.34874934 +0000 UTC m=+149.723462225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.868110 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:04 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:04 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:04 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.868163 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.946569 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.946721 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.446696947 +0000 UTC m=+149.821409832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:04 crc kubenswrapper[4751]: I0131 14:44:04.946761 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:04 crc kubenswrapper[4751]: E0131 14:44:04.947047 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.447032936 +0000 UTC m=+149.821745821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.048393 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.048567 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.548543696 +0000 UTC m=+149.923256581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.049794 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.050132 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.550124948 +0000 UTC m=+149.924837833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.150564 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.150761 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.650734055 +0000 UTC m=+150.025446940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.150866 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.151161 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.651150186 +0000 UTC m=+150.025863071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.164148 4751 csr.go:261] certificate signing request csr-47twz is approved, waiting to be issued
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.171125 4751 csr.go:257] certificate signing request csr-47twz is issued
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.251906 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.252272 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.752245705 +0000 UTC m=+150.126958590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.353706 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.353749 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.353775 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.353800 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.353859 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.354537 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.854519266 +0000 UTC m=+150.229232151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.356807 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.360031 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.360536 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.362610 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.454611 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.454767 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.954741502 +0000 UTC m=+150.329454387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.454911 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.455224 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:05.955217645 +0000 UTC m=+150.329930530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.556022 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.556201 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.056176601 +0000 UTC m=+150.430889486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.556244 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.556595 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.056565581 +0000 UTC m=+150.431278466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.621883 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.630202 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.637491 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.657900 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.657996 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.157981539 +0000 UTC m=+150.532694424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.658177 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.658426 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.158420161 +0000 UTC m=+150.533133036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.709460 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9dj7" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.751291 4751 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7hjp9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.751331 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" podUID="89a244ab-c405-48aa-893f-f50995384ede" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.760301 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.760546 4751 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.260533257 +0000 UTC m=+150.635246142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.861386 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.861900 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.361889563 +0000 UTC m=+150.736602438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.867689 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 14:44:05 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Jan 31 14:44:05 crc kubenswrapper[4751]: [+]process-running ok Jan 31 14:44:05 crc kubenswrapper[4751]: healthz check failed Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.867748 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 14:44:05 crc kubenswrapper[4751]: I0131 14:44:05.964498 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:05 crc kubenswrapper[4751]: E0131 14:44:05.964837 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 14:44:06.464821041 +0000 UTC m=+150.839533916 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.066773 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.067105 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.567094492 +0000 UTC m=+150.941807377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.095647 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wcnsn"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.098583 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.103646 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.167652 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.168040 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-catalog-content\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.168109 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-utilities\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.168162 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzp7l\" (UniqueName: \"kubernetes.io/projected/074619b7-9220-4377-b93d-6088199a5e16-kube-api-access-pzp7l\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.168321 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.668304125 +0000 UTC m=+151.043017010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.173153 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-31 14:39:05 +0000 UTC, rotation deadline is 2026-12-15 17:02:29.334903936 +0000 UTC Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.173194 4751 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7634h18m23.161712952s for next certificate rotation Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.199049 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcnsn"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.270674 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzp7l\" (UniqueName: \"kubernetes.io/projected/074619b7-9220-4377-b93d-6088199a5e16-kube-api-access-pzp7l\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.270770 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.270807 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-utilities\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.270828 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-catalog-content\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.271306 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-catalog-content\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.278660 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-utilities\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.280370 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.780345063 +0000 UTC m=+151.155057948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.299606 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m4m6r"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.300577 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: W0131 14:44:06.302412 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-52c9d95b8b20694f3ac62508dc0d5d93be23cde43d59f9d6743b2c3adecdeb4f WatchSource:0}: Error finding container 52c9d95b8b20694f3ac62508dc0d5d93be23cde43d59f9d6743b2c3adecdeb4f: Status 404 returned error can't find the container with id 52c9d95b8b20694f3ac62508dc0d5d93be23cde43d59f9d6743b2c3adecdeb4f Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.306303 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.306937 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzp7l\" (UniqueName: \"kubernetes.io/projected/074619b7-9220-4377-b93d-6088199a5e16-kube-api-access-pzp7l\") pod \"community-operators-wcnsn\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 
14:44:06.310970 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m4m6r"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.372505 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.372728 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-catalog-content\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.372773 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-utilities\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.372790 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566b8\" (UniqueName: \"kubernetes.io/projected/8d5f1383-42d7-47a1-9e47-8dba038241d2-kube-api-access-566b8\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.372884 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.872868806 +0000 UTC m=+151.247581691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.439238 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.473850 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-catalog-content\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.473905 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.473925 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-utilities\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " 
pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.473940 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566b8\" (UniqueName: \"kubernetes.io/projected/8d5f1383-42d7-47a1-9e47-8dba038241d2-kube-api-access-566b8\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.474344 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-catalog-content\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.474453 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:06.974443229 +0000 UTC m=+151.349156114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.474632 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-utilities\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.476258 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ln2lx"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.477155 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.489875 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566b8\" (UniqueName: \"kubernetes.io/projected/8d5f1383-42d7-47a1-9e47-8dba038241d2-kube-api-access-566b8\") pod \"certified-operators-m4m6r\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.496796 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ln2lx"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.574659 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.574918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-catalog-content\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.574954 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd7bp\" (UniqueName: \"kubernetes.io/projected/d5c0f5c8-cecf-451f-abef-bf357716eb71-kube-api-access-xd7bp\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.574977 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-utilities\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.575178 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.075162718 +0000 UTC m=+151.449875603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.634680 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.671324 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2lq4t"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.672194 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.681650 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2lq4t"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.686797 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-catalog-content\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.686834 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd7bp\" (UniqueName: \"kubernetes.io/projected/d5c0f5c8-cecf-451f-abef-bf357716eb71-kube-api-access-xd7bp\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.686857 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-utilities\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.686907 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.687490 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-catalog-content\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.687840 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-utilities\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.688027 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.188014408 +0000 UTC m=+151.562727293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.705917 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd7bp\" (UniqueName: \"kubernetes.io/projected/d5c0f5c8-cecf-451f-abef-bf357716eb71-kube-api-access-xd7bp\") pod \"community-operators-ln2lx\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") " pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.776517 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"70b37644cca0f337fc4dfd4871e75b89a2f42b4c243cf2cfc79c2e019ace9a46"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.777007 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"52c9d95b8b20694f3ac62508dc0d5d93be23cde43d59f9d6743b2c3adecdeb4f"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.779062 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"170a4113c11ddf109b05e9b2b2d59fdc24f80149db398271581c0e03098fdceb"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.779115 4751 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"50465fb60971365621e3ff7ead311c712da1f8cd2c99229b9e0bc34c0650ce23"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.781705 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c7abe610f55ef7baaa7e163f1517703c61608ff709ef5bf94be18db548949429"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.781766 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"07ec98dd31415f89b39b186e72ce385edc73781a64d2d2f5fcf1affce07c6f0c"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.782214 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.784735 4751 generic.go:334] "Generic (PLEG): container finished" podID="eade01dc-846b-42a8-a6ed-8cf0a0663e82" containerID="9d26a6d6092efc3cfe1b53bda2539e32fc75d0f27a288ecda4b2062254a0fc73" exitCode=0 Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.784778 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" event={"ID":"eade01dc-846b-42a8-a6ed-8cf0a0663e82","Type":"ContainerDied","Data":"9d26a6d6092efc3cfe1b53bda2539e32fc75d0f27a288ecda4b2062254a0fc73"} Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.788681 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.788933 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-catalog-content\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.788977 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-utilities\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.788998 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2g96\" (UniqueName: \"kubernetes.io/projected/c447796d-48ac-4eeb-8fe6-ad411966b3d3-kube-api-access-n2g96\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.789124 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.289061257 +0000 UTC m=+151.663774142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.829992 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln2lx" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.833544 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.834189 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.837557 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.837580 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.851159 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.864754 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m4m6r"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.870988 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 14:44:06 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Jan 31 14:44:06 crc kubenswrapper[4751]: [+]process-running ok Jan 31 14:44:06 crc kubenswrapper[4751]: healthz check failed Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.871125 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.891036 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-catalog-content\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.891139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.891174 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-utilities\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.891202 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2g96\" (UniqueName: 
\"kubernetes.io/projected/c447796d-48ac-4eeb-8fe6-ad411966b3d3-kube-api-access-n2g96\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.891245 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.891315 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.891930 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.391904442 +0000 UTC m=+151.766617317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.892405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-catalog-content\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.892622 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-utilities\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.916624 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcnsn"] Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.919872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2g96\" (UniqueName: \"kubernetes.io/projected/c447796d-48ac-4eeb-8fe6-ad411966b3d3-kube-api-access-n2g96\") pod \"certified-operators-2lq4t\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:06 crc kubenswrapper[4751]: W0131 14:44:06.944064 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074619b7_9220_4377_b93d_6088199a5e16.slice/crio-092d3acc3e94a3dfd58bc12b9df82ef7950bf9b5a3e7871999c9c0efa3eb1c6d WatchSource:0}: Error finding container 092d3acc3e94a3dfd58bc12b9df82ef7950bf9b5a3e7871999c9c0efa3eb1c6d: Status 404 returned error can't find the container with id 092d3acc3e94a3dfd58bc12b9df82ef7950bf9b5a3e7871999c9c0efa3eb1c6d Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.993009 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.993836 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.993939 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.993949 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 
14:44:06 crc kubenswrapper[4751]: E0131 14:44:06.994051 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.494030039 +0000 UTC m=+151.868742974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:06 crc kubenswrapper[4751]: I0131 14:44:06.998613 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.010988 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.094839 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.095167 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.59515626 +0000 UTC m=+151.969869135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.113603 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ln2lx"] Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.155511 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.195794 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.196125 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.696108935 +0000 UTC m=+152.070821820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: W0131 14:44:07.220389 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c0f5c8_cecf_451f_abef_bf357716eb71.slice/crio-a9f4794a6036dec4476c4be7ee2587554c0cf25782f49b8a2635038cb9771dcf WatchSource:0}: Error finding container a9f4794a6036dec4476c4be7ee2587554c0cf25782f49b8a2635038cb9771dcf: Status 404 returned error can't find the container with id a9f4794a6036dec4476c4be7ee2587554c0cf25782f49b8a2635038cb9771dcf Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.256896 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2lq4t"] Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.299749 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.300030 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.800017959 +0000 UTC m=+152.174730844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.400836 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.401034 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.901002576 +0000 UTC m=+152.275715461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.401316 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.401699 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:07.901688644 +0000 UTC m=+152.276401529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.411627 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.502475 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.502772 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.002757202 +0000 UTC m=+152.377470087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.603488 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.603958 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.103940603 +0000 UTC m=+152.478653548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.704210 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.704421 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.204390986 +0000 UTC m=+152.579103871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.704903 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.705213 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.205200327 +0000 UTC m=+152.579913212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.789865 4751 generic.go:334] "Generic (PLEG): container finished" podID="074619b7-9220-4377-b93d-6088199a5e16" containerID="c0a252955873aa8b7cfdf7c617f1852f7e64f86f50411d0f5cc675309d6a71b6" exitCode=0 Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.789927 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcnsn" event={"ID":"074619b7-9220-4377-b93d-6088199a5e16","Type":"ContainerDied","Data":"c0a252955873aa8b7cfdf7c617f1852f7e64f86f50411d0f5cc675309d6a71b6"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.789953 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcnsn" event={"ID":"074619b7-9220-4377-b93d-6088199a5e16","Type":"ContainerStarted","Data":"092d3acc3e94a3dfd58bc12b9df82ef7950bf9b5a3e7871999c9c0efa3eb1c6d"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.791268 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.791537 4751 generic.go:334] "Generic (PLEG): container finished" podID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerID="e34fa377384a9a30f2361b80400e882c53155e0b5c8ad5f9beb3a5c178384ca0" exitCode=0 Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.791594 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4m6r" 
event={"ID":"8d5f1383-42d7-47a1-9e47-8dba038241d2","Type":"ContainerDied","Data":"e34fa377384a9a30f2361b80400e882c53155e0b5c8ad5f9beb3a5c178384ca0"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.791617 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4m6r" event={"ID":"8d5f1383-42d7-47a1-9e47-8dba038241d2","Type":"ContainerStarted","Data":"cce74deb968262c3870a67f8d4e000b52815c6a74a72fbfe9270cef7ee6b23e7"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.794645 4751 generic.go:334] "Generic (PLEG): container finished" podID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerID="f55678880104a29f2f67c32892dfe2939404ec7dce246a6e2dd6c365f96de5ab" exitCode=0 Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.794694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerDied","Data":"f55678880104a29f2f67c32892dfe2939404ec7dce246a6e2dd6c365f96de5ab"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.794707 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerStarted","Data":"5c1f5c13def0721993c42fbb7e9330a705cffc8e6326a288871d364ef1275f63"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.795781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8","Type":"ContainerStarted","Data":"370d04a4b77cb1df2a005e656252d040f2a1db0e2e84ee84b32b04f105cfd9d0"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.798889 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" 
event={"ID":"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1","Type":"ContainerStarted","Data":"3a7629ab15f1a744d9219d9345494f0ef6457c0d790da978b3f784b4ef8a6850"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.800085 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerID="4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08" exitCode=0 Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.800165 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln2lx" event={"ID":"d5c0f5c8-cecf-451f-abef-bf357716eb71","Type":"ContainerDied","Data":"4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.800188 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln2lx" event={"ID":"d5c0f5c8-cecf-451f-abef-bf357716eb71","Type":"ContainerStarted","Data":"a9f4794a6036dec4476c4be7ee2587554c0cf25782f49b8a2635038cb9771dcf"} Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.805862 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.806166 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.306152503 +0000 UTC m=+152.680865378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.868119 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 14:44:07 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Jan 31 14:44:07 crc kubenswrapper[4751]: [+]process-running ok Jan 31 14:44:07 crc kubenswrapper[4751]: healthz check failed Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.868189 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 14:44:07 crc kubenswrapper[4751]: I0131 14:44:07.908944 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:07 crc kubenswrapper[4751]: E0131 14:44:07.910613 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 14:44:08.410599761 +0000 UTC m=+152.785312656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.009713 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.010130 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.510115899 +0000 UTC m=+152.884828784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.019133 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.038337 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.046065 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.046632 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdsj4" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.095214 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xfl"] Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.095457 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eade01dc-846b-42a8-a6ed-8cf0a0663e82" containerName="collect-profiles" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.095469 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eade01dc-846b-42a8-a6ed-8cf0a0663e82" containerName="collect-profiles" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.095580 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eade01dc-846b-42a8-a6ed-8cf0a0663e82" containerName="collect-profiles" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.096369 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.097761 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xfl"] Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.101615 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.110653 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eade01dc-846b-42a8-a6ed-8cf0a0663e82-config-volume\") pod \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.110693 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eade01dc-846b-42a8-a6ed-8cf0a0663e82-secret-volume\") pod \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.110892 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwjbh\" (UniqueName: \"kubernetes.io/projected/eade01dc-846b-42a8-a6ed-8cf0a0663e82-kube-api-access-zwjbh\") pod \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\" (UID: \"eade01dc-846b-42a8-a6ed-8cf0a0663e82\") " Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.111050 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.113781 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eade01dc-846b-42a8-a6ed-8cf0a0663e82-config-volume" (OuterVolumeSpecName: "config-volume") pod "eade01dc-846b-42a8-a6ed-8cf0a0663e82" (UID: "eade01dc-846b-42a8-a6ed-8cf0a0663e82"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.115859 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.615842121 +0000 UTC m=+152.990555006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.121044 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eade01dc-846b-42a8-a6ed-8cf0a0663e82-kube-api-access-zwjbh" (OuterVolumeSpecName: "kube-api-access-zwjbh") pod "eade01dc-846b-42a8-a6ed-8cf0a0663e82" (UID: "eade01dc-846b-42a8-a6ed-8cf0a0663e82"). InnerVolumeSpecName "kube-api-access-zwjbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.121914 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eade01dc-846b-42a8-a6ed-8cf0a0663e82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eade01dc-846b-42a8-a6ed-8cf0a0663e82" (UID: "eade01dc-846b-42a8-a6ed-8cf0a0663e82"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.169666 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-4m7jl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.169724 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4m7jl" podUID="d723501b-bb29-4d60-ad97-239eb749771f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.173980 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-4m7jl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.174013 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4m7jl" podUID="d723501b-bb29-4d60-ad97-239eb749771f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.215989 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.216170 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkxqv\" (UniqueName: \"kubernetes.io/projected/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-kube-api-access-wkxqv\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.216216 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-utilities\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.216263 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-catalog-content\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.216364 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwjbh\" (UniqueName: \"kubernetes.io/projected/eade01dc-846b-42a8-a6ed-8cf0a0663e82-kube-api-access-zwjbh\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.216369 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.716350495 +0000 UTC m=+153.091063470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.216399 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eade01dc-846b-42a8-a6ed-8cf0a0663e82-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.216412 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eade01dc-846b-42a8-a6ed-8cf0a0663e82-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.318874 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-catalog-content\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.319049 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkxqv\" (UniqueName: \"kubernetes.io/projected/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-kube-api-access-wkxqv\") pod \"redhat-marketplace-k2xfl\" (UID: 
\"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.319147 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.319232 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-utilities\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.319355 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-catalog-content\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.319745 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-utilities\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.319883 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 14:44:08.819872758 +0000 UTC m=+153.194585643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.337218 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkxqv\" (UniqueName: \"kubernetes.io/projected/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-kube-api-access-wkxqv\") pod \"redhat-marketplace-k2xfl\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.420452 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.420987 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:08.920965038 +0000 UTC m=+153.295677943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.446684 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.484514 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nfjx5"] Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.486059 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.499744 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfjx5"] Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.522667 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.523015 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.022996222 +0000 UTC m=+153.397709107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.544417 4751 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5f7jc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]log ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]etcd ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/max-in-flight-filter ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 31 14:44:08 crc kubenswrapper[4751]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 31 14:44:08 crc kubenswrapper[4751]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/project.openshift.io-projectcache ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/openshift.io-startinformers ok Jan 31 14:44:08 crc kubenswrapper[4751]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 31 14:44:08 crc 
kubenswrapper[4751]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 14:44:08 crc kubenswrapper[4751]: livez check failed Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.544496 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc" podUID="89314349-bbc8-4886-b93b-51358e4e71b0" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.605527 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.606919 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.612861 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.612899 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.623710 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.623853 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.123813894 +0000 UTC m=+153.498526789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.624053 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.624114 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nf7x\" (UniqueName: \"kubernetes.io/projected/e771b68a-beea-4c8b-a085-b869155ca20d-kube-api-access-4nf7x\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.624180 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-utilities\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.624292 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-catalog-content\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.624684 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.124674167 +0000 UTC m=+153.499387062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.627325 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.681252 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xfl"] Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725412 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.725581 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.225558061 +0000 UTC m=+153.600270956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725641 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725672 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nf7x\" (UniqueName: \"kubernetes.io/projected/e771b68a-beea-4c8b-a085-b869155ca20d-kube-api-access-4nf7x\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725715 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-utilities\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725768 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7322d0f6-a94f-48be-98fb-b2883f20cc53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725820 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7322d0f6-a94f-48be-98fb-b2883f20cc53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.725846 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-catalog-content\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.726486 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-catalog-content\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.726632 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-utilities\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.726807 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.226795984 +0000 UTC m=+153.601508979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.757689 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nf7x\" (UniqueName: \"kubernetes.io/projected/e771b68a-beea-4c8b-a085-b869155ca20d-kube-api-access-4nf7x\") pod \"redhat-marketplace-nfjx5\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.815144 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.824379 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xfl" event={"ID":"e656c7af-fbd9-4e9c-ae61-d4142d37c89f","Type":"ContainerStarted","Data":"ed378354261ea17a2d24e834a9aed8f1a45166375fb6ae1ce1dc38b9af3b5e0f"} Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.828898 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.829214 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7322d0f6-a94f-48be-98fb-b2883f20cc53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.829260 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7322d0f6-a94f-48be-98fb-b2883f20cc53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.829701 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.32967943 +0000 UTC m=+153.704392305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.829736 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7322d0f6-a94f-48be-98fb-b2883f20cc53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.851256 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" event={"ID":"eade01dc-846b-42a8-a6ed-8cf0a0663e82","Type":"ContainerDied","Data":"9d44648df839910022878a08450dec667db28fe365908b86584da87c8884b401"} Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.851333 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d44648df839910022878a08450dec667db28fe365908b86584da87c8884b401" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.851531 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.856988 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7322d0f6-a94f-48be-98fb-b2883f20cc53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.864031 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5hn9b" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.868613 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8","Type":"ContainerStarted","Data":"3baa617a27e83d80f5320f7cc47fc62891a992ae7b55cc71b019d15fc16ab870"} Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.869559 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 14:44:08 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Jan 31 14:44:08 crc kubenswrapper[4751]: [+]process-running ok Jan 31 14:44:08 crc kubenswrapper[4751]: healthz check failed Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.869617 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.873786 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" event={"ID":"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1","Type":"ContainerStarted","Data":"93d7520b8789253b8932588bc554d325acef997a5d57f5ac4aef79ae4024916e"} Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.883772 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.883751968 podStartE2EDuration="2.883751968s" podCreationTimestamp="2026-01-31 14:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:08.88345257 +0000 UTC m=+153.258165455" watchObservedRunningTime="2026-01-31 14:44:08.883751968 +0000 UTC m=+153.258464853" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.897488 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.897548 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.928635 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.928696 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-h262z" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.930152 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:08 crc kubenswrapper[4751]: E0131 14:44:08.930474 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.430459172 +0000 UTC m=+153.805172057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.931837 4751 patch_prober.go:28] interesting pod/console-f9d7485db-h262z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.931871 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-h262z" podUID="5caeb3dc-2a42-41b5-ac91-c1c8a216fb43" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.945628 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 14:44:08 crc kubenswrapper[4751]: I0131 14:44:08.991657 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7hjp9" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.023364 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vc9q2" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.026111 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.031591 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.031693 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.531675884 +0000 UTC m=+153.906388759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.032666 4751 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.033613 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.034150 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.534136609 +0000 UTC m=+153.908849494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.048716 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7hc86" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.081938 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfjx5"] Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.135620 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.135777 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.635756403 +0000 UTC m=+154.010469288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.135992 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.139193 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.639161303 +0000 UTC m=+154.013874358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.238910 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.239109 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.73905737 +0000 UTC m=+154.113770255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.239374 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.239809 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.73979266 +0000 UTC m=+154.114505545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.270402 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gktqp"] Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.271410 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.273454 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.287853 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gktqp"] Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.340595 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.340783 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.840756006 +0000 UTC m=+154.215468891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.340832 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-catalog-content\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.340925 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgf8m\" (UniqueName: \"kubernetes.io/projected/0cfb2e52-7371-4d38-994c-92b5b7d123cc-kube-api-access-qgf8m\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.340991 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-utilities\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.341038 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.341565 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.841539427 +0000 UTC m=+154.216252342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.393990 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 14:44:09 crc kubenswrapper[4751]: W0131 14:44:09.426430 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7322d0f6_a94f_48be_98fb_b2883f20cc53.slice/crio-a3df1f6a5863eb19b6d181d426ef7f986d5e8f0fcb559160484636c7ea634096 WatchSource:0}: Error finding container a3df1f6a5863eb19b6d181d426ef7f986d5e8f0fcb559160484636c7ea634096: Status 404 returned error can't find the container with id a3df1f6a5863eb19b6d181d426ef7f986d5e8f0fcb559160484636c7ea634096 Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.442212 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.442505 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.942460642 +0000 UTC m=+154.317173537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.443443 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgf8m\" (UniqueName: \"kubernetes.io/projected/0cfb2e52-7371-4d38-994c-92b5b7d123cc-kube-api-access-qgf8m\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.443550 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-utilities\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.443620 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.443686 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-catalog-content\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.444181 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:09.944157826 +0000 UTC m=+154.318870721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.444483 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-catalog-content\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.444840 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-utilities\") pod 
\"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.466219 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgf8m\" (UniqueName: \"kubernetes.io/projected/0cfb2e52-7371-4d38-994c-92b5b7d123cc-kube-api-access-qgf8m\") pod \"redhat-operators-gktqp\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.545239 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.545430 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:10.04540281 +0000 UTC m=+154.420115695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.545662 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.546026 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:10.046002166 +0000 UTC m=+154.420715051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.588057 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.646938 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.647194 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:10.147137756 +0000 UTC m=+154.521850641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.647324 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.647791 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-31 14:44:10.147758383 +0000 UTC m=+154.522471268 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.674993 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s7j7f"] Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.675981 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.687322 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s7j7f"] Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.748272 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.748474 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 14:44:10.248442891 +0000 UTC m=+154.623155786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.749138 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwzm\" (UniqueName: \"kubernetes.io/projected/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-kube-api-access-8hwzm\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.749180 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-catalog-content\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.749335 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-utilities\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.749393 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: E0131 14:44:09.762514 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 14:44:10.262485242 +0000 UTC m=+154.637198127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpbgx" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.831590 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gktqp"] Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.836595 4751 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-31T14:44:09.032942248Z","Handler":null,"Name":""} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.839512 4751 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.839545 4751 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 31 14:44:09 crc kubenswrapper[4751]: W0131 14:44:09.841311 4751 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cfb2e52_7371_4d38_994c_92b5b7d123cc.slice/crio-6d8aa8d0e0300436346b38972033f042890b145471a99f8a553c2f56d280787e WatchSource:0}: Error finding container 6d8aa8d0e0300436346b38972033f042890b145471a99f8a553c2f56d280787e: Status 404 returned error can't find the container with id 6d8aa8d0e0300436346b38972033f042890b145471a99f8a553c2f56d280787e Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.851962 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.852208 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-catalog-content\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.852252 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-utilities\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.852329 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwzm\" (UniqueName: \"kubernetes.io/projected/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-kube-api-access-8hwzm\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " 
pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.852746 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-catalog-content\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.858542 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-utilities\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.859303 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.867737 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 14:44:09 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Jan 31 14:44:09 crc kubenswrapper[4751]: [+]process-running ok Jan 31 14:44:09 crc kubenswrapper[4751]: healthz check failed Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.867786 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.873281 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwzm\" (UniqueName: \"kubernetes.io/projected/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-kube-api-access-8hwzm\") pod \"redhat-operators-s7j7f\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.889515 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" event={"ID":"a55fc688-004a-4d6f-a48e-c10b0ae1d8f1","Type":"ContainerStarted","Data":"ba5e0e5ba49d3b46b4260cd7fd4839fed6a2a5958b6941e1910ddfd9298fbde7"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.892317 4751 generic.go:334] "Generic (PLEG): container finished" podID="e771b68a-beea-4c8b-a085-b869155ca20d" containerID="cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455" exitCode=0 Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.892408 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-nfjx5" event={"ID":"e771b68a-beea-4c8b-a085-b869155ca20d","Type":"ContainerDied","Data":"cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.892469 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfjx5" event={"ID":"e771b68a-beea-4c8b-a085-b869155ca20d","Type":"ContainerStarted","Data":"17e2b2135e55e973ccc015ba33cfd9e0c7a1763d73b3153f649e1c6747bac744"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.894339 4751 generic.go:334] "Generic (PLEG): container finished" podID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerID="6c75c5ad4aa0723fec261497091fc30b60d95e73f9fe993ece85f3e477da66ef" exitCode=0 Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.894409 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xfl" event={"ID":"e656c7af-fbd9-4e9c-ae61-d4142d37c89f","Type":"ContainerDied","Data":"6c75c5ad4aa0723fec261497091fc30b60d95e73f9fe993ece85f3e477da66ef"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.896160 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7322d0f6-a94f-48be-98fb-b2883f20cc53","Type":"ContainerStarted","Data":"38c7f576f0ad4b5e8d74c391485eb57e1eab7f03e125ea86814743a2e11cd91c"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.896191 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7322d0f6-a94f-48be-98fb-b2883f20cc53","Type":"ContainerStarted","Data":"a3df1f6a5863eb19b6d181d426ef7f986d5e8f0fcb559160484636c7ea634096"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.897456 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" 
event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerStarted","Data":"6d8aa8d0e0300436346b38972033f042890b145471a99f8a553c2f56d280787e"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.903915 4751 generic.go:334] "Generic (PLEG): container finished" podID="44a681ea-f7f5-4eba-b40e-03ea17fd4bf8" containerID="3baa617a27e83d80f5320f7cc47fc62891a992ae7b55cc71b019d15fc16ab870" exitCode=0 Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.903951 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8","Type":"ContainerDied","Data":"3baa617a27e83d80f5320f7cc47fc62891a992ae7b55cc71b019d15fc16ab870"} Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.915750 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x4rnh" podStartSLOduration=14.915717788 podStartE2EDuration="14.915717788s" podCreationTimestamp="2026-01-31 14:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:09.909290999 +0000 UTC m=+154.284003884" watchObservedRunningTime="2026-01-31 14:44:09.915717788 +0000 UTC m=+154.290430673" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.944990 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.944973141 podStartE2EDuration="1.944973141s" podCreationTimestamp="2026-01-31 14:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:09.928333372 +0000 UTC m=+154.303046267" watchObservedRunningTime="2026-01-31 14:44:09.944973141 +0000 UTC m=+154.319686026" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.954232 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.961118 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 31 14:44:09 crc kubenswrapper[4751]: I0131 14:44:09.961160 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:09.999965 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.000153 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpbgx\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.221289 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s7j7f"] Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.276522 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.441935 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.555002 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpbgx"] Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.870644 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 14:44:10 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld Jan 31 14:44:10 crc kubenswrapper[4751]: [+]process-running ok Jan 31 14:44:10 crc kubenswrapper[4751]: healthz check failed Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.870712 4751 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.911023 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerStarted","Data":"7b416007999209b30e30ac3cbb706b9a31917cc6ff3256ae9a397696b89670d4"}
Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.913539 4751 generic.go:334] "Generic (PLEG): container finished" podID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerID="aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23" exitCode=0
Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.913589 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerDied","Data":"aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23"}
Jan 31 14:44:10 crc kubenswrapper[4751]: I0131 14:44:10.917529 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" event={"ID":"4e18e163-6cf0-48ef-9a6f-90cbece870b0","Type":"ContainerStarted","Data":"f189ebd73b2de2ffc6329477d3690421c7e4c89608c81de50df6ebb8b9b1c5e0"}
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.174549 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.276267 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kubelet-dir\") pod \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") "
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.276338 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kube-api-access\") pod \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\" (UID: \"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8\") "
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.279235 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "44a681ea-f7f5-4eba-b40e-03ea17fd4bf8" (UID: "44a681ea-f7f5-4eba-b40e-03ea17fd4bf8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.282374 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "44a681ea-f7f5-4eba-b40e-03ea17fd4bf8" (UID: "44a681ea-f7f5-4eba-b40e-03ea17fd4bf8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.378420 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 31 14:44:11 crc kubenswrapper[4751]: I0131 14:44:11.378460 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a681ea-f7f5-4eba-b40e-03ea17fd4bf8-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:11.867116 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:12 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:12 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:12 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:11.867195 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:12.042344 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"44a681ea-f7f5-4eba-b40e-03ea17fd4bf8","Type":"ContainerDied","Data":"370d04a4b77cb1df2a005e656252d040f2a1db0e2e84ee84b32b04f105cfd9d0"}
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:12.042419 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="370d04a4b77cb1df2a005e656252d040f2a1db0e2e84ee84b32b04f105cfd9d0"
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:12.042418 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:12.867468 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:12 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:12 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:12 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:12 crc kubenswrapper[4751]: I0131 14:44:12.867539 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.049606 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" event={"ID":"4e18e163-6cf0-48ef-9a6f-90cbece870b0","Type":"ContainerStarted","Data":"4a4776950d27c1d1245ca6dd71fb7012b30d42bb2d21525539ad27b3f377c032"}
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.049788 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.053285 4751 generic.go:334] "Generic (PLEG): container finished" podID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerID="9dea3e4098c379086439a00ba95f58535865ef9c6e3300b004af608a3da30bb4" exitCode=0
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.053301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerDied","Data":"9dea3e4098c379086439a00ba95f58535865ef9c6e3300b004af608a3da30bb4"}
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.055661 4751 generic.go:334] "Generic (PLEG): container finished" podID="7322d0f6-a94f-48be-98fb-b2883f20cc53" containerID="38c7f576f0ad4b5e8d74c391485eb57e1eab7f03e125ea86814743a2e11cd91c" exitCode=0
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.055694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7322d0f6-a94f-48be-98fb-b2883f20cc53","Type":"ContainerDied","Data":"38c7f576f0ad4b5e8d74c391485eb57e1eab7f03e125ea86814743a2e11cd91c"}
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.070470 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" podStartSLOduration=132.070453942 podStartE2EDuration="2m12.070453942s" podCreationTimestamp="2026-01-31 14:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:13.069353943 +0000 UTC m=+157.444066838" watchObservedRunningTime="2026-01-31 14:44:13.070453942 +0000 UTC m=+157.445166847"
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.540812 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.545394 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5f7jc"
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.868240 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:13 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:13 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:13 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:13 crc kubenswrapper[4751]: I0131 14:44:13.868748 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:14 crc kubenswrapper[4751]: I0131 14:44:14.047890 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-skzbg"
Jan 31 14:44:14 crc kubenswrapper[4751]: I0131 14:44:14.866139 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:14 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:14 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:14 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:14 crc kubenswrapper[4751]: I0131 14:44:14.866181 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:15 crc kubenswrapper[4751]: I0131 14:44:15.866414 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:15 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:15 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:15 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:15 crc kubenswrapper[4751]: I0131 14:44:15.866750 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:16 crc kubenswrapper[4751]: I0131 14:44:16.866984 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:16 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:16 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:16 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:16 crc kubenswrapper[4751]: I0131 14:44:16.867039 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:17 crc kubenswrapper[4751]: I0131 14:44:17.866437 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:17 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:17 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:17 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:17 crc kubenswrapper[4751]: I0131 14:44:17.866755 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.166542 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-4m7jl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.166597 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4m7jl" podUID="d723501b-bb29-4d60-ad97-239eb749771f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.166606 4751 patch_prober.go:28] interesting pod/downloads-7954f5f757-4m7jl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.166673 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4m7jl" podUID="d723501b-bb29-4d60-ad97-239eb749771f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.865907 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:18 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:18 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:18 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.865974 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.929030 4751 patch_prober.go:28] interesting pod/console-f9d7485db-h262z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Jan 31 14:44:18 crc kubenswrapper[4751]: I0131 14:44:18.929097 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-h262z" podUID="5caeb3dc-2a42-41b5-ac91-c1c8a216fb43" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused"
Jan 31 14:44:19 crc kubenswrapper[4751]: I0131 14:44:19.866838 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:19 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:19 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:19 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:19 crc kubenswrapper[4751]: I0131 14:44:19.866894 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:20 crc kubenswrapper[4751]: I0131 14:44:20.868039 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:20 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:20 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:20 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:20 crc kubenswrapper[4751]: I0131 14:44:20.868122 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.352415 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.437440 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7322d0f6-a94f-48be-98fb-b2883f20cc53-kubelet-dir\") pod \"7322d0f6-a94f-48be-98fb-b2883f20cc53\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") "
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.437562 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7322d0f6-a94f-48be-98fb-b2883f20cc53-kube-api-access\") pod \"7322d0f6-a94f-48be-98fb-b2883f20cc53\" (UID: \"7322d0f6-a94f-48be-98fb-b2883f20cc53\") "
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.437577 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7322d0f6-a94f-48be-98fb-b2883f20cc53-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7322d0f6-a94f-48be-98fb-b2883f20cc53" (UID: "7322d0f6-a94f-48be-98fb-b2883f20cc53"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.437795 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7322d0f6-a94f-48be-98fb-b2883f20cc53-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.442671 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7322d0f6-a94f-48be-98fb-b2883f20cc53-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7322d0f6-a94f-48be-98fb-b2883f20cc53" (UID: "7322d0f6-a94f-48be-98fb-b2883f20cc53"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.539130 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7322d0f6-a94f-48be-98fb-b2883f20cc53-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.866886 4751 patch_prober.go:28] interesting pod/router-default-5444994796-5hn9b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 14:44:21 crc kubenswrapper[4751]: [-]has-synced failed: reason withheld
Jan 31 14:44:21 crc kubenswrapper[4751]: [+]process-running ok
Jan 31 14:44:21 crc kubenswrapper[4751]: healthz check failed
Jan 31 14:44:21 crc kubenswrapper[4751]: I0131 14:44:21.866963 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5hn9b" podUID="01ff1674-4e01-4cdc-aea3-1e91a6a389e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 14:44:22 crc kubenswrapper[4751]: I0131 14:44:22.122046 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7322d0f6-a94f-48be-98fb-b2883f20cc53","Type":"ContainerDied","Data":"a3df1f6a5863eb19b6d181d426ef7f986d5e8f0fcb559160484636c7ea634096"}
Jan 31 14:44:22 crc kubenswrapper[4751]: I0131 14:44:22.122105 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3df1f6a5863eb19b6d181d426ef7f986d5e8f0fcb559160484636c7ea634096"
Jan 31 14:44:22 crc kubenswrapper[4751]: I0131 14:44:22.122108 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 31 14:44:22 crc kubenswrapper[4751]: I0131 14:44:22.867136 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:44:22 crc kubenswrapper[4751]: I0131 14:44:22.869840 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5hn9b"
Jan 31 14:44:23 crc kubenswrapper[4751]: I0131 14:44:23.969898 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:44:23 crc kubenswrapper[4751]: I0131 14:44:23.976868 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68aeb9c7-d3c3-4c34-96ab-bb947421c504-metrics-certs\") pod \"network-metrics-daemon-xtn6l\" (UID: \"68aeb9c7-d3c3-4c34-96ab-bb947421c504\") " pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:44:24 crc kubenswrapper[4751]: I0131 14:44:24.010685 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxjf5"]
Jan 31 14:44:24 crc kubenswrapper[4751]: I0131 14:44:24.011006 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" containerID="cri-o://96a0531e47323a9257c24b651a7067cc71a6c2a1c9189022bfa8c72e23c446c1" gracePeriod=30
Jan 31 14:44:24 crc kubenswrapper[4751]: I0131 14:44:24.024458 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xtn6l"
Jan 31 14:44:24 crc kubenswrapper[4751]: I0131 14:44:24.025895 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"]
Jan 31 14:44:24 crc kubenswrapper[4751]: I0131 14:44:24.026419 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" containerID="cri-o://8e402889398f0b5d93bacd46f42378e3cdc7f2ee478995578d04804d8ec0f029" gracePeriod=30
Jan 31 14:44:25 crc kubenswrapper[4751]: I0131 14:44:25.153119 4751 generic.go:334] "Generic (PLEG): container finished" podID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerID="8e402889398f0b5d93bacd46f42378e3cdc7f2ee478995578d04804d8ec0f029" exitCode=0
Jan 31 14:44:25 crc kubenswrapper[4751]: I0131 14:44:25.153211 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" event={"ID":"84e2930a-5ae3-4171-a3dd-e5eea62ef157","Type":"ContainerDied","Data":"8e402889398f0b5d93bacd46f42378e3cdc7f2ee478995578d04804d8ec0f029"}
Jan 31 14:44:25 crc kubenswrapper[4751]: I0131 14:44:25.154560 4751 generic.go:334] "Generic (PLEG): container finished" podID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerID="96a0531e47323a9257c24b651a7067cc71a6c2a1c9189022bfa8c72e23c446c1" exitCode=0
Jan 31 14:44:25 crc kubenswrapper[4751]: I0131 14:44:25.154587 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" event={"ID":"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8","Type":"ContainerDied","Data":"96a0531e47323a9257c24b651a7067cc71a6c2a1c9189022bfa8c72e23c446c1"}
Jan 31 14:44:28 crc kubenswrapper[4751]: I0131 14:44:28.172329 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4m7jl"
Jan 31 14:44:28 crc kubenswrapper[4751]: I0131 14:44:28.937922 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:44:28 crc kubenswrapper[4751]: I0131 14:44:28.946740 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-h262z"
Jan 31 14:44:29 crc kubenswrapper[4751]: I0131 14:44:29.014485 4751 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sxjf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 31 14:44:29 crc kubenswrapper[4751]: I0131 14:44:29.014573 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 31 14:44:29 crc kubenswrapper[4751]: I0131 14:44:29.839328 4751 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7762w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 31 14:44:29 crc kubenswrapper[4751]: I0131 14:44:29.839409 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 31 14:44:30 crc kubenswrapper[4751]: I0131 14:44:30.285805 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx"
Jan 31 14:44:36 crc kubenswrapper[4751]: E0131 14:44:36.600269 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Jan 31 14:44:36 crc kubenswrapper[4751]: E0131 14:44:36.601327 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xd7bp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ln2lx_openshift-marketplace(d5c0f5c8-cecf-451f-abef-bf357716eb71): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 31 14:44:36 crc kubenswrapper[4751]: E0131 14:44:36.602542 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ln2lx" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71"
Jan 31 14:44:38 crc kubenswrapper[4751]: I0131 14:44:38.896786 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 14:44:38 crc kubenswrapper[4751]: I0131 14:44:38.897268 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 14:44:39 crc kubenswrapper[4751]: I0131 14:44:39.014751 4751 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sxjf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 31 14:44:39 crc kubenswrapper[4751]: I0131 14:44:39.015344 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 31 14:44:39 crc kubenswrapper[4751]: I0131 14:44:39.279454 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kvfvk"
Jan 31 14:44:39 crc kubenswrapper[4751]: I0131 14:44:39.839818 4751 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7762w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": context deadline exceeded" start-of-body=
Jan 31 14:44:39 crc kubenswrapper[4751]: I0131 14:44:39.839912 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": context deadline exceeded"
Jan 31 14:44:45 crc kubenswrapper[4751]: E0131 14:44:45.164920 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ln2lx" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.607456 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 31 14:44:45 crc kubenswrapper[4751]: E0131 14:44:45.607840 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a681ea-f7f5-4eba-b40e-03ea17fd4bf8" containerName="pruner"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.607867 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a681ea-f7f5-4eba-b40e-03ea17fd4bf8" containerName="pruner"
Jan 31 14:44:45 crc kubenswrapper[4751]: E0131 14:44:45.607895 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7322d0f6-a94f-48be-98fb-b2883f20cc53" containerName="pruner"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.607911 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7322d0f6-a94f-48be-98fb-b2883f20cc53" containerName="pruner"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.608185 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a681ea-f7f5-4eba-b40e-03ea17fd4bf8" containerName="pruner"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.608215 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7322d0f6-a94f-48be-98fb-b2883f20cc53" containerName="pruner"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.609010 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.618414 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.618498 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.622160 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.657916 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.729189 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ee27ad5-3acb-4388-a964-3b526b79e776-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.729427 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ee27ad5-3acb-4388-a964-3b526b79e776-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.830154 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ee27ad5-3acb-4388-a964-3b526b79e776-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.830292 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ee27ad5-3acb-4388-a964-3b526b79e776-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.830400 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ee27ad5-3acb-4388-a964-3b526b79e776-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.852872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ee27ad5-3acb-4388-a964-3b526b79e776-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:45 crc kubenswrapper[4751]: I0131 14:44:45.983136 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:44:49 crc kubenswrapper[4751]: I0131 14:44:49.015016 4751 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sxjf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 31 14:44:49 crc kubenswrapper[4751]: I0131 14:44:49.015090 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 31 14:44:49 crc kubenswrapper[4751]: E0131 14:44:49.044582 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 31 14:44:49 crc kubenswrapper[4751]: E0131 14:44:49.044869 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n2g96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2lq4t_openshift-marketplace(c447796d-48ac-4eeb-8fe6-ad411966b3d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 14:44:49 crc kubenswrapper[4751]: E0131 14:44:49.046246 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2lq4t" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" Jan 31 14:44:49 crc 
kubenswrapper[4751]: I0131 14:44:49.839584 4751 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7762w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 14:44:49 crc kubenswrapper[4751]: I0131 14:44:49.839730 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.004255 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.005627 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.016023 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.106772 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-var-lock\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.107130 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.107192 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kube-api-access\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.208308 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.208393 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kube-api-access\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.208442 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-var-lock\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.208552 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-var-lock\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.208612 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.240728 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kube-api-access\") pod \"installer-9-crc\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: E0131 14:44:50.291119 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-2lq4t" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.317150 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" event={"ID":"84e2930a-5ae3-4171-a3dd-e5eea62ef157","Type":"ContainerDied","Data":"15e734ffd4fba2493be6a9b1bfbac50c0f6bd9a8e2ffdca45f856621c3703f44"} Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.317206 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e734ffd4fba2493be6a9b1bfbac50c0f6bd9a8e2ffdca45f856621c3703f44" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.320999 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" event={"ID":"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8","Type":"ContainerDied","Data":"f58e74380a8c1e3f0d559b0c6a44b9911f247b06dc418233b9c41d9a25e6f05e"} Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.321039 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f58e74380a8c1e3f0d559b0c6a44b9911f247b06dc418233b9c41d9a25e6f05e" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.335981 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.349434 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.355215 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.402834 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f"] Jan 31 14:44:50 crc kubenswrapper[4751]: E0131 14:44:50.403186 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.403207 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" Jan 31 14:44:50 crc kubenswrapper[4751]: E0131 14:44:50.403218 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.403228 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.403366 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" containerName="route-controller-manager" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.403384 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" containerName="controller-manager" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.406178 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.411850 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-config\") pod \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.411903 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-proxy-ca-bundles\") pod \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.411941 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbprf\" (UniqueName: \"kubernetes.io/projected/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-kube-api-access-fbprf\") pod \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.411966 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2930a-5ae3-4171-a3dd-e5eea62ef157-serving-cert\") pod \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412004 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-config\") pod \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412040 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-client-ca\") pod \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412084 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksqdw\" (UniqueName: \"kubernetes.io/projected/84e2930a-5ae3-4171-a3dd-e5eea62ef157-kube-api-access-ksqdw\") pod \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\" (UID: \"84e2930a-5ae3-4171-a3dd-e5eea62ef157\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412121 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-serving-cert\") pod \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412162 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-client-ca\") pod \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\" (UID: \"c1e92f9b-2291-4bd5-80b8-c2f9e667acf8\") " Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412261 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-client-ca\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412332 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78cdf25f-daec-4cd7-8954-1fef6f3727db-serving-cert\") pod 
\"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412369 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-config\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.412400 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tv6b\" (UniqueName: \"kubernetes.io/projected/78cdf25f-daec-4cd7-8954-1fef6f3727db-kube-api-access-6tv6b\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.413319 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-config" (OuterVolumeSpecName: "config") pod "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" (UID: "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.413895 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" (UID: "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.415009 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" (UID: "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.415508 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-client-ca" (OuterVolumeSpecName: "client-ca") pod "84e2930a-5ae3-4171-a3dd-e5eea62ef157" (UID: "84e2930a-5ae3-4171-a3dd-e5eea62ef157"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.415717 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-config" (OuterVolumeSpecName: "config") pod "84e2930a-5ae3-4171-a3dd-e5eea62ef157" (UID: "84e2930a-5ae3-4171-a3dd-e5eea62ef157"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.427642 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-kube-api-access-fbprf" (OuterVolumeSpecName: "kube-api-access-fbprf") pod "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" (UID: "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8"). InnerVolumeSpecName "kube-api-access-fbprf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.428226 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e2930a-5ae3-4171-a3dd-e5eea62ef157-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "84e2930a-5ae3-4171-a3dd-e5eea62ef157" (UID: "84e2930a-5ae3-4171-a3dd-e5eea62ef157"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.428395 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" (UID: "c1e92f9b-2291-4bd5-80b8-c2f9e667acf8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.428568 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e2930a-5ae3-4171-a3dd-e5eea62ef157-kube-api-access-ksqdw" (OuterVolumeSpecName: "kube-api-access-ksqdw") pod "84e2930a-5ae3-4171-a3dd-e5eea62ef157" (UID: "84e2930a-5ae3-4171-a3dd-e5eea62ef157"). InnerVolumeSpecName "kube-api-access-ksqdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.469160 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f"] Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513449 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78cdf25f-daec-4cd7-8954-1fef6f3727db-serving-cert\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513514 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-config\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513551 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tv6b\" (UniqueName: \"kubernetes.io/projected/78cdf25f-daec-4cd7-8954-1fef6f3727db-kube-api-access-6tv6b\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513597 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-client-ca\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" 
Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513656 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksqdw\" (UniqueName: \"kubernetes.io/projected/84e2930a-5ae3-4171-a3dd-e5eea62ef157-kube-api-access-ksqdw\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513682 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513701 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513716 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513731 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513749 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbprf\" (UniqueName: \"kubernetes.io/projected/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8-kube-api-access-fbprf\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513761 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e2930a-5ae3-4171-a3dd-e5eea62ef157-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513772 4751 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.513782 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84e2930a-5ae3-4171-a3dd-e5eea62ef157-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.514829 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-client-ca\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.516945 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-config\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.517412 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78cdf25f-daec-4cd7-8954-1fef6f3727db-serving-cert\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.534156 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tv6b\" (UniqueName: \"kubernetes.io/projected/78cdf25f-daec-4cd7-8954-1fef6f3727db-kube-api-access-6tv6b\") pod \"route-controller-manager-6fb78c7854-7b78f\" (UID: 
\"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:50 crc kubenswrapper[4751]: I0131 14:44:50.771726 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:51 crc kubenswrapper[4751]: I0131 14:44:51.326396 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sxjf5" Jan 31 14:44:51 crc kubenswrapper[4751]: I0131 14:44:51.326423 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w" Jan 31 14:44:51 crc kubenswrapper[4751]: I0131 14:44:51.353330 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxjf5"] Jan 31 14:44:51 crc kubenswrapper[4751]: I0131 14:44:51.361522 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sxjf5"] Jan 31 14:44:51 crc kubenswrapper[4751]: I0131 14:44:51.366610 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"] Jan 31 14:44:51 crc kubenswrapper[4751]: I0131 14:44:51.370174 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7762w"] Jan 31 14:44:52 crc kubenswrapper[4751]: I0131 14:44:52.419954 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e2930a-5ae3-4171-a3dd-e5eea62ef157" path="/var/lib/kubelet/pods/84e2930a-5ae3-4171-a3dd-e5eea62ef157/volumes" Jan 31 14:44:52 crc kubenswrapper[4751]: I0131 14:44:52.421045 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e92f9b-2291-4bd5-80b8-c2f9e667acf8" 
path="/var/lib/kubelet/pods/c1e92f9b-2291-4bd5-80b8-c2f9e667acf8/volumes" Jan 31 14:44:52 crc kubenswrapper[4751]: E0131 14:44:52.522757 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 31 14:44:52 crc kubenswrapper[4751]: E0131 14:44:52.522988 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-566b8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Contai
nerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-m4m6r_openshift-marketplace(8d5f1383-42d7-47a1-9e47-8dba038241d2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 14:44:52 crc kubenswrapper[4751]: E0131 14:44:52.524330 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-m4m6r" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" Jan 31 14:44:52 crc kubenswrapper[4751]: E0131 14:44:52.772490 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 31 14:44:52 crc kubenswrapper[4751]: E0131 14:44:52.772790 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkxqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k2xfl_openshift-marketplace(e656c7af-fbd9-4e9c-ae61-d4142d37c89f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 14:44:52 crc kubenswrapper[4751]: E0131 14:44:52.774453 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k2xfl" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" Jan 31 14:44:53 crc 
kubenswrapper[4751]: I0131 14:44:53.110358 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc"] Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.111896 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.116150 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.116728 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.116775 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.116878 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.117050 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.123552 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.124804 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.133027 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc"] Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.251843 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-config\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.251924 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7689427f-2c92-4b56-9617-1139504142ee-serving-cert\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.252059 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krs4f\" (UniqueName: \"kubernetes.io/projected/7689427f-2c92-4b56-9617-1139504142ee-kube-api-access-krs4f\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.252169 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-proxy-ca-bundles\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.252235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-client-ca\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: 
\"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.353948 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krs4f\" (UniqueName: \"kubernetes.io/projected/7689427f-2c92-4b56-9617-1139504142ee-kube-api-access-krs4f\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.354267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-proxy-ca-bundles\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.354381 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-client-ca\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.355776 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-client-ca\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.356006 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-proxy-ca-bundles\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.357905 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-config\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.357989 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-config\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.358062 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7689427f-2c92-4b56-9617-1139504142ee-serving-cert\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.369600 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7689427f-2c92-4b56-9617-1139504142ee-serving-cert\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.400528 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-krs4f\" (UniqueName: \"kubernetes.io/projected/7689427f-2c92-4b56-9617-1139504142ee-kube-api-access-krs4f\") pod \"controller-manager-9ff5cfb77-zl2hc\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.435294 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.584612 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k2xfl" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.587265 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-m4m6r" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.662401 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.662647 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgf8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gktqp_openshift-marketplace(0cfb2e52-7371-4d38-994c-92b5b7d123cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.664028 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gktqp" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" Jan 31 14:44:53 crc 
kubenswrapper[4751]: E0131 14:44:53.665737 4751 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.665957 4751 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4nf7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-nfjx5_openshift-marketplace(e771b68a-beea-4c8b-a085-b869155ca20d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 14:44:53 crc kubenswrapper[4751]: E0131 14:44:53.667501 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nfjx5" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.875806 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xtn6l"] Jan 31 14:44:53 crc kubenswrapper[4751]: W0131 14:44:53.881364 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68aeb9c7_d3c3_4c34_96ab_bb947421c504.slice/crio-e3f878d12e7733c635007689de1ae2b125573c090c2c8d57308fa277204993ab WatchSource:0}: Error finding container e3f878d12e7733c635007689de1ae2b125573c090c2c8d57308fa277204993ab: Status 404 returned error can't find the container with id e3f878d12e7733c635007689de1ae2b125573c090c2c8d57308fa277204993ab Jan 31 14:44:53 crc kubenswrapper[4751]: I0131 14:44:53.979212 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f"] Jan 31 14:44:53 crc kubenswrapper[4751]: W0131 14:44:53.982456 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78cdf25f_daec_4cd7_8954_1fef6f3727db.slice/crio-970e5859fee2ce045474246c69d85b582bd206939166d1d5d70f07913b640d31 WatchSource:0}: Error finding container 970e5859fee2ce045474246c69d85b582bd206939166d1d5d70f07913b640d31: Status 404 returned error can't 
find the container with id 970e5859fee2ce045474246c69d85b582bd206939166d1d5d70f07913b640d31 Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.130852 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc"] Jan 31 14:44:54 crc kubenswrapper[4751]: W0131 14:44:54.134130 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ee27ad5_3acb_4388_a964_3b526b79e776.slice/crio-6ffb197e3834ae2da47e7ac1e9c5209aca77770c8ec1ba80483d9a0a51243fd6 WatchSource:0}: Error finding container 6ffb197e3834ae2da47e7ac1e9c5209aca77770c8ec1ba80483d9a0a51243fd6: Status 404 returned error can't find the container with id 6ffb197e3834ae2da47e7ac1e9c5209aca77770c8ec1ba80483d9a0a51243fd6 Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.135510 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.139365 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.352701 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ccfa0c88-7f51-4d85-8a49-e05865c6a06e","Type":"ContainerStarted","Data":"ca8ebba9df4a8c9712a669a8d97759aea5c95bd694f2cead6b4521af30eb8469"} Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.354167 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" event={"ID":"7689427f-2c92-4b56-9617-1139504142ee","Type":"ContainerStarted","Data":"53e8f013a379679cef6168fde6d18706b1593479aa88c48f6b833cf1be744d64"} Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.355191 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"4ee27ad5-3acb-4388-a964-3b526b79e776","Type":"ContainerStarted","Data":"6ffb197e3834ae2da47e7ac1e9c5209aca77770c8ec1ba80483d9a0a51243fd6"} Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.356297 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" event={"ID":"78cdf25f-daec-4cd7-8954-1fef6f3727db","Type":"ContainerStarted","Data":"970e5859fee2ce045474246c69d85b582bd206939166d1d5d70f07913b640d31"} Jan 31 14:44:54 crc kubenswrapper[4751]: I0131 14:44:54.357404 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" event={"ID":"68aeb9c7-d3c3-4c34-96ab-bb947421c504","Type":"ContainerStarted","Data":"e3f878d12e7733c635007689de1ae2b125573c090c2c8d57308fa277204993ab"} Jan 31 14:44:54 crc kubenswrapper[4751]: E0131 14:44:54.361756 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nfjx5" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" Jan 31 14:44:54 crc kubenswrapper[4751]: E0131 14:44:54.365459 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gktqp" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.363963 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerStarted","Data":"f9bdddf94b5f6d16e3861f0fec527d5909cdc3d4c12d6d71c61a9d592a18874f"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.366045 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" event={"ID":"78cdf25f-daec-4cd7-8954-1fef6f3727db","Type":"ContainerStarted","Data":"d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.366222 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.369221 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" event={"ID":"68aeb9c7-d3c3-4c34-96ab-bb947421c504","Type":"ContainerStarted","Data":"6bc10453d43a1e5be2ec99bfe8bab5eef283d3e7ba32bb938b4f299d4b7611e7"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.372317 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ccfa0c88-7f51-4d85-8a49-e05865c6a06e","Type":"ContainerStarted","Data":"8b4d59b21d9818b51f757f56dda578d9b5e64551b0acae90d2098c728b3290ee"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.373286 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.376377 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" event={"ID":"7689427f-2c92-4b56-9617-1139504142ee","Type":"ContainerStarted","Data":"5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.377004 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.379887 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4ee27ad5-3acb-4388-a964-3b526b79e776","Type":"ContainerStarted","Data":"088b308890b05df5e2b8d2107eb926cdcdbce0d50a37541483e97b4f64b46c2c"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.381496 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.382795 4751 generic.go:334] "Generic (PLEG): container finished" podID="074619b7-9220-4377-b93d-6088199a5e16" containerID="a757fc9386532749c4b360530fb36362a62f17d343908433db3d64555171c0b9" exitCode=0 Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.382829 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcnsn" event={"ID":"074619b7-9220-4377-b93d-6088199a5e16","Type":"ContainerDied","Data":"a757fc9386532749c4b360530fb36362a62f17d343908433db3d64555171c0b9"} Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.404693 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.404679202 podStartE2EDuration="6.404679202s" podCreationTimestamp="2026-01-31 14:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:55.404398214 +0000 UTC m=+199.779111109" watchObservedRunningTime="2026-01-31 14:44:55.404679202 +0000 UTC m=+199.779392087" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.426657 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" podStartSLOduration=12.426636969 podStartE2EDuration="12.426636969s" podCreationTimestamp="2026-01-31 14:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:55.423006443 +0000 UTC m=+199.797719338" watchObservedRunningTime="2026-01-31 14:44:55.426636969 +0000 UTC m=+199.801349864" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.466763 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.466713182 podStartE2EDuration="10.466713182s" podCreationTimestamp="2026-01-31 14:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:55.461140495 +0000 UTC m=+199.835853380" watchObservedRunningTime="2026-01-31 14:44:55.466713182 +0000 UTC m=+199.841426077" Jan 31 14:44:55 crc kubenswrapper[4751]: I0131 14:44:55.484326 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" podStartSLOduration=11.484306454 podStartE2EDuration="11.484306454s" podCreationTimestamp="2026-01-31 14:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:55.477939947 +0000 UTC m=+199.852652832" watchObservedRunningTime="2026-01-31 14:44:55.484306454 +0000 UTC m=+199.859019339" Jan 31 14:44:56 crc kubenswrapper[4751]: I0131 14:44:56.392743 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xtn6l" event={"ID":"68aeb9c7-d3c3-4c34-96ab-bb947421c504","Type":"ContainerStarted","Data":"60bab24f3dc5e2c7363b1ca62b341cb3c2ce6d95eb14311354c74fe4b027b247"} Jan 31 14:44:56 crc kubenswrapper[4751]: I0131 14:44:56.423104 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xtn6l" podStartSLOduration=175.423088194 podStartE2EDuration="2m55.423088194s" podCreationTimestamp="2026-01-31 14:42:01 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:44:56.420748542 +0000 UTC m=+200.795461437" watchObservedRunningTime="2026-01-31 14:44:56.423088194 +0000 UTC m=+200.797801089"
Jan 31 14:44:59 crc kubenswrapper[4751]: I0131 14:44:59.420980 4751 generic.go:334] "Generic (PLEG): container finished" podID="4ee27ad5-3acb-4388-a964-3b526b79e776" containerID="088b308890b05df5e2b8d2107eb926cdcdbce0d50a37541483e97b4f64b46c2c" exitCode=0
Jan 31 14:44:59 crc kubenswrapper[4751]: I0131 14:44:59.421146 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4ee27ad5-3acb-4388-a964-3b526b79e776","Type":"ContainerDied","Data":"088b308890b05df5e2b8d2107eb926cdcdbce0d50a37541483e97b4f64b46c2c"}
Jan 31 14:44:59 crc kubenswrapper[4751]: I0131 14:44:59.425253 4751 generic.go:334] "Generic (PLEG): container finished" podID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerID="f9bdddf94b5f6d16e3861f0fec527d5909cdc3d4c12d6d71c61a9d592a18874f" exitCode=0
Jan 31 14:44:59 crc kubenswrapper[4751]: I0131 14:44:59.425325 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerDied","Data":"f9bdddf94b5f6d16e3861f0fec527d5909cdc3d4c12d6d71c61a9d592a18874f"}
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.150703 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"]
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.152311 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.157226 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.157261 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.171669 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"]
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.266340 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7534cc-afa8-4cd1-acb0-e4269e55316b-secret-volume\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.266449 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4xf\" (UniqueName: \"kubernetes.io/projected/7a7534cc-afa8-4cd1-acb0-e4269e55316b-kube-api-access-pd4xf\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.266839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7534cc-afa8-4cd1-acb0-e4269e55316b-config-volume\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.368981 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7534cc-afa8-4cd1-acb0-e4269e55316b-secret-volume\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.369065 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4xf\" (UniqueName: \"kubernetes.io/projected/7a7534cc-afa8-4cd1-acb0-e4269e55316b-kube-api-access-pd4xf\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.369119 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7534cc-afa8-4cd1-acb0-e4269e55316b-config-volume\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.370344 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7534cc-afa8-4cd1-acb0-e4269e55316b-config-volume\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.376522 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7534cc-afa8-4cd1-acb0-e4269e55316b-secret-volume\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.394127 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4xf\" (UniqueName: \"kubernetes.io/projected/7a7534cc-afa8-4cd1-acb0-e4269e55316b-kube-api-access-pd4xf\") pod \"collect-profiles-29497845-gwlpr\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.432054 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcnsn" event={"ID":"074619b7-9220-4377-b93d-6088199a5e16","Type":"ContainerStarted","Data":"362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b"}
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.450200 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wcnsn" podStartSLOduration=3.477883351 podStartE2EDuration="54.450184338s" podCreationTimestamp="2026-01-31 14:44:06 +0000 UTC" firstStartedPulling="2026-01-31 14:44:07.791011553 +0000 UTC m=+152.165724438" lastFinishedPulling="2026-01-31 14:44:58.76331253 +0000 UTC m=+203.138025425" observedRunningTime="2026-01-31 14:45:00.448575515 +0000 UTC m=+204.823288410" watchObservedRunningTime="2026-01-31 14:45:00.450184338 +0000 UTC m=+204.824897223"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.521379 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.779913 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.875821 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ee27ad5-3acb-4388-a964-3b526b79e776-kubelet-dir\") pod \"4ee27ad5-3acb-4388-a964-3b526b79e776\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") "
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.875915 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ee27ad5-3acb-4388-a964-3b526b79e776-kube-api-access\") pod \"4ee27ad5-3acb-4388-a964-3b526b79e776\" (UID: \"4ee27ad5-3acb-4388-a964-3b526b79e776\") "
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.875937 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ee27ad5-3acb-4388-a964-3b526b79e776-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ee27ad5-3acb-4388-a964-3b526b79e776" (UID: "4ee27ad5-3acb-4388-a964-3b526b79e776"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.876204 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ee27ad5-3acb-4388-a964-3b526b79e776-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.882259 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee27ad5-3acb-4388-a964-3b526b79e776-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ee27ad5-3acb-4388-a964-3b526b79e776" (UID: "4ee27ad5-3acb-4388-a964-3b526b79e776"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:45:00 crc kubenswrapper[4751]: I0131 14:45:00.977223 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ee27ad5-3acb-4388-a964-3b526b79e776-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.138806 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"]
Jan 31 14:45:01 crc kubenswrapper[4751]: W0131 14:45:01.182034 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7534cc_afa8_4cd1_acb0_e4269e55316b.slice/crio-7610769ab3f7dc50064969e1696261c3c40fd1cfc71d4e432425c19f9e85417a WatchSource:0}: Error finding container 7610769ab3f7dc50064969e1696261c3c40fd1cfc71d4e432425c19f9e85417a: Status 404 returned error can't find the container with id 7610769ab3f7dc50064969e1696261c3c40fd1cfc71d4e432425c19f9e85417a
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.437840 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4ee27ad5-3acb-4388-a964-3b526b79e776","Type":"ContainerDied","Data":"6ffb197e3834ae2da47e7ac1e9c5209aca77770c8ec1ba80483d9a0a51243fd6"}
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.438147 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ffb197e3834ae2da47e7ac1e9c5209aca77770c8ec1ba80483d9a0a51243fd6"
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.437896 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.446632 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr" event={"ID":"7a7534cc-afa8-4cd1-acb0-e4269e55316b","Type":"ContainerStarted","Data":"14e3784ceb4cadc39980cddb4a29a7503a31cfa20643871453d5f7d2495d2d0d"}
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.446816 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr" event={"ID":"7a7534cc-afa8-4cd1-acb0-e4269e55316b","Type":"ContainerStarted","Data":"7610769ab3f7dc50064969e1696261c3c40fd1cfc71d4e432425c19f9e85417a"}
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.449574 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerID="ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c" exitCode=0
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.449662 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln2lx" event={"ID":"d5c0f5c8-cecf-451f-abef-bf357716eb71","Type":"ContainerDied","Data":"ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c"}
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.454885 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerStarted","Data":"060b3a3ddee4e9105f7ffb6a1ce801e4a26650b10b65707cadc77226dc18ea06"}
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.495809 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr" podStartSLOduration=1.495781953 podStartE2EDuration="1.495781953s" podCreationTimestamp="2026-01-31 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:45:01.467459299 +0000 UTC m=+205.842172194" watchObservedRunningTime="2026-01-31 14:45:01.495781953 +0000 UTC m=+205.870494838"
Jan 31 14:45:01 crc kubenswrapper[4751]: I0131 14:45:01.496221 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s7j7f" podStartSLOduration=4.663128504 podStartE2EDuration="52.496215074s" podCreationTimestamp="2026-01-31 14:44:09 +0000 UTC" firstStartedPulling="2026-01-31 14:44:13.055692183 +0000 UTC m=+157.430405068" lastFinishedPulling="2026-01-31 14:45:00.888778723 +0000 UTC m=+205.263491638" observedRunningTime="2026-01-31 14:45:01.489510898 +0000 UTC m=+205.864223793" watchObservedRunningTime="2026-01-31 14:45:01.496215074 +0000 UTC m=+205.870927959"
Jan 31 14:45:02 crc kubenswrapper[4751]: I0131 14:45:02.461026 4751 generic.go:334] "Generic (PLEG): container finished" podID="7a7534cc-afa8-4cd1-acb0-e4269e55316b" containerID="14e3784ceb4cadc39980cddb4a29a7503a31cfa20643871453d5f7d2495d2d0d" exitCode=0
Jan 31 14:45:02 crc kubenswrapper[4751]: I0131 14:45:02.461098 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr" event={"ID":"7a7534cc-afa8-4cd1-acb0-e4269e55316b","Type":"ContainerDied","Data":"14e3784ceb4cadc39980cddb4a29a7503a31cfa20643871453d5f7d2495d2d0d"}
Jan 31 14:45:02 crc kubenswrapper[4751]: I0131 14:45:02.464795 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln2lx" event={"ID":"d5c0f5c8-cecf-451f-abef-bf357716eb71","Type":"ContainerStarted","Data":"9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a"}
Jan 31 14:45:02 crc kubenswrapper[4751]: I0131 14:45:02.466819 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerStarted","Data":"6ee8fb8b12b6ee8cd20623ca96e7a87fc43d879e6d76a01bc3e55e235825e807"}
Jan 31 14:45:02 crc kubenswrapper[4751]: I0131 14:45:02.518936 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ln2lx" podStartSLOduration=2.384002687 podStartE2EDuration="56.518917329s" podCreationTimestamp="2026-01-31 14:44:06 +0000 UTC" firstStartedPulling="2026-01-31 14:44:07.802103636 +0000 UTC m=+152.176816521" lastFinishedPulling="2026-01-31 14:45:01.937018268 +0000 UTC m=+206.311731163" observedRunningTime="2026-01-31 14:45:02.517938693 +0000 UTC m=+206.892651588" watchObservedRunningTime="2026-01-31 14:45:02.518917329 +0000 UTC m=+206.893630214"
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.473592 4751 generic.go:334] "Generic (PLEG): container finished" podID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerID="6ee8fb8b12b6ee8cd20623ca96e7a87fc43d879e6d76a01bc3e55e235825e807" exitCode=0
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.473676 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerDied","Data":"6ee8fb8b12b6ee8cd20623ca96e7a87fc43d879e6d76a01bc3e55e235825e807"}
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.790029 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.932697 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7534cc-afa8-4cd1-acb0-e4269e55316b-secret-volume\") pod \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") "
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.933272 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7534cc-afa8-4cd1-acb0-e4269e55316b-config-volume\") pod \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") "
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.933613 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4xf\" (UniqueName: \"kubernetes.io/projected/7a7534cc-afa8-4cd1-acb0-e4269e55316b-kube-api-access-pd4xf\") pod \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\" (UID: \"7a7534cc-afa8-4cd1-acb0-e4269e55316b\") "
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.934104 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7534cc-afa8-4cd1-acb0-e4269e55316b-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a7534cc-afa8-4cd1-acb0-e4269e55316b" (UID: "7a7534cc-afa8-4cd1-acb0-e4269e55316b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.939923 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7534cc-afa8-4cd1-acb0-e4269e55316b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a7534cc-afa8-4cd1-acb0-e4269e55316b" (UID: "7a7534cc-afa8-4cd1-acb0-e4269e55316b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:45:03 crc kubenswrapper[4751]: I0131 14:45:03.948035 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7534cc-afa8-4cd1-acb0-e4269e55316b-kube-api-access-pd4xf" (OuterVolumeSpecName: "kube-api-access-pd4xf") pod "7a7534cc-afa8-4cd1-acb0-e4269e55316b" (UID: "7a7534cc-afa8-4cd1-acb0-e4269e55316b"). InnerVolumeSpecName "kube-api-access-pd4xf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.035682 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a7534cc-afa8-4cd1-acb0-e4269e55316b-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.035716 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a7534cc-afa8-4cd1-acb0-e4269e55316b-config-volume\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.035726 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4xf\" (UniqueName: \"kubernetes.io/projected/7a7534cc-afa8-4cd1-acb0-e4269e55316b-kube-api-access-pd4xf\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.480843 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr" event={"ID":"7a7534cc-afa8-4cd1-acb0-e4269e55316b","Type":"ContainerDied","Data":"7610769ab3f7dc50064969e1696261c3c40fd1cfc71d4e432425c19f9e85417a"}
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.481877 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7610769ab3f7dc50064969e1696261c3c40fd1cfc71d4e432425c19f9e85417a"
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.480881 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497845-gwlpr"
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.482730 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerStarted","Data":"cd00a1023a77b74abf41fc259fe9b2a475f7ada25dc1b39aa83f13c57794ccfd"}
Jan 31 14:45:04 crc kubenswrapper[4751]: I0131 14:45:04.509365 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2lq4t" podStartSLOduration=2.334571377 podStartE2EDuration="58.509345464s" podCreationTimestamp="2026-01-31 14:44:06 +0000 UTC" firstStartedPulling="2026-01-31 14:44:07.795598224 +0000 UTC m=+152.170311109" lastFinishedPulling="2026-01-31 14:45:03.970372311 +0000 UTC m=+208.345085196" observedRunningTime="2026-01-31 14:45:04.508687687 +0000 UTC m=+208.883400602" watchObservedRunningTime="2026-01-31 14:45:04.509345464 +0000 UTC m=+208.884058359"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.439856 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wcnsn"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.440920 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wcnsn"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.502958 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerStarted","Data":"0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765"}
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.830921 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ln2lx"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.832640 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ln2lx"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.905808 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ln2lx"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.906489 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wcnsn"
Jan 31 14:45:06 crc kubenswrapper[4751]: I0131 14:45:06.957839 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wcnsn"
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.006061 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2lq4t"
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.006685 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2lq4t"
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.059884 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2lq4t"
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.513627 4751 generic.go:334] "Generic (PLEG): container finished" podID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerID="874aebfb442c94d60aaad947db92520e6e5ff745ee226afefd00dd9dc85cb564" exitCode=0
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.513909 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xfl" event={"ID":"e656c7af-fbd9-4e9c-ae61-d4142d37c89f","Type":"ContainerDied","Data":"874aebfb442c94d60aaad947db92520e6e5ff745ee226afefd00dd9dc85cb564"}
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.519907 4751 generic.go:334] "Generic (PLEG): container finished" podID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerID="0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765" exitCode=0
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.520018 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerDied","Data":"0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765"}
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.523524 4751 generic.go:334] "Generic (PLEG): container finished" podID="e771b68a-beea-4c8b-a085-b869155ca20d" containerID="449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74" exitCode=0
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.523689 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfjx5" event={"ID":"e771b68a-beea-4c8b-a085-b869155ca20d","Type":"ContainerDied","Data":"449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74"}
Jan 31 14:45:07 crc kubenswrapper[4751]: I0131 14:45:07.583586 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ln2lx"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.529781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerStarted","Data":"632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea"}
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.534313 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfjx5" event={"ID":"e771b68a-beea-4c8b-a085-b869155ca20d","Type":"ContainerStarted","Data":"3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24"}
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.538287 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xfl" event={"ID":"e656c7af-fbd9-4e9c-ae61-d4142d37c89f","Type":"ContainerStarted","Data":"3bb7101aeb47dd5d5b9aa6ef1075e32a424c360c1ebaa7fd0787c20e4303f647"}
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.552290 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gktqp" podStartSLOduration=2.199002408 podStartE2EDuration="59.552274663s" podCreationTimestamp="2026-01-31 14:44:09 +0000 UTC" firstStartedPulling="2026-01-31 14:44:10.916982288 +0000 UTC m=+155.291695173" lastFinishedPulling="2026-01-31 14:45:08.270254543 +0000 UTC m=+212.644967428" observedRunningTime="2026-01-31 14:45:08.550594018 +0000 UTC m=+212.925306903" watchObservedRunningTime="2026-01-31 14:45:08.552274663 +0000 UTC m=+212.926987548"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.571264 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nfjx5" podStartSLOduration=2.338379742 podStartE2EDuration="1m0.571245271s" podCreationTimestamp="2026-01-31 14:44:08 +0000 UTC" firstStartedPulling="2026-01-31 14:44:09.899267564 +0000 UTC m=+154.273980449" lastFinishedPulling="2026-01-31 14:45:08.132133063 +0000 UTC m=+212.506845978" observedRunningTime="2026-01-31 14:45:08.56855237 +0000 UTC m=+212.943265255" watchObservedRunningTime="2026-01-31 14:45:08.571245271 +0000 UTC m=+212.945958176"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.587080 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k2xfl" podStartSLOduration=2.318436348 podStartE2EDuration="1m0.587041166s" podCreationTimestamp="2026-01-31 14:44:08 +0000 UTC" firstStartedPulling="2026-01-31 14:44:09.899322416 +0000 UTC m=+154.274035301" lastFinishedPulling="2026-01-31 14:45:08.167927234 +0000 UTC m=+212.542640119" observedRunningTime="2026-01-31 14:45:08.583801881 +0000 UTC m=+212.958514776" watchObservedRunningTime="2026-01-31 14:45:08.587041166 +0000 UTC m=+212.961754051"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.815920 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nfjx5"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.816258 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nfjx5"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.896811 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.896871 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.896915 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.897535 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 14:45:08 crc kubenswrapper[4751]: I0131 14:45:08.897639 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2" gracePeriod=600
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.545595 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2" exitCode=0
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.545674 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2"}
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.546326 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"45cb0d3a062f00471c149bf8e8ee7eaef0df67968aef3870677e63ed898aa00d"}
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.548577 4751 generic.go:334] "Generic (PLEG): container finished" podID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerID="c0f53c12a6e17e599de6a624dae5a0ba532d7e88bc9baf9838475b082d03f347" exitCode=0
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.548649 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4m6r" event={"ID":"8d5f1383-42d7-47a1-9e47-8dba038241d2","Type":"ContainerDied","Data":"c0f53c12a6e17e599de6a624dae5a0ba532d7e88bc9baf9838475b082d03f347"}
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.589050 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gktqp"
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.589116 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gktqp"
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.773985 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ln2lx"]
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.774250 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ln2lx" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="registry-server" containerID="cri-o://9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a" gracePeriod=2
Jan 31 14:45:09 crc kubenswrapper[4751]: I0131 14:45:09.850616 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-nfjx5" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="registry-server" probeResult="failure" output=<
Jan 31 14:45:09 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s
Jan 31 14:45:09 crc kubenswrapper[4751]: >
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.000405 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s7j7f"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.000448 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s7j7f"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.067147 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s7j7f"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.264399 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln2lx"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.328475 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-catalog-content\") pod \"d5c0f5c8-cecf-451f-abef-bf357716eb71\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") "
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.328583 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd7bp\" (UniqueName: \"kubernetes.io/projected/d5c0f5c8-cecf-451f-abef-bf357716eb71-kube-api-access-xd7bp\") pod \"d5c0f5c8-cecf-451f-abef-bf357716eb71\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") "
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.328621 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-utilities\") pod \"d5c0f5c8-cecf-451f-abef-bf357716eb71\" (UID: \"d5c0f5c8-cecf-451f-abef-bf357716eb71\") "
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.329893 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-utilities" (OuterVolumeSpecName: "utilities") pod "d5c0f5c8-cecf-451f-abef-bf357716eb71" (UID: "d5c0f5c8-cecf-451f-abef-bf357716eb71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.334909 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c0f5c8-cecf-451f-abef-bf357716eb71-kube-api-access-xd7bp" (OuterVolumeSpecName: "kube-api-access-xd7bp") pod "d5c0f5c8-cecf-451f-abef-bf357716eb71" (UID: "d5c0f5c8-cecf-451f-abef-bf357716eb71"). InnerVolumeSpecName "kube-api-access-xd7bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.402533 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5c0f5c8-cecf-451f-abef-bf357716eb71" (UID: "d5c0f5c8-cecf-451f-abef-bf357716eb71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.435028 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.435084 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd7bp\" (UniqueName: \"kubernetes.io/projected/d5c0f5c8-cecf-451f-abef-bf357716eb71-kube-api-access-xd7bp\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.435111 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c0f5c8-cecf-451f-abef-bf357716eb71-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.443573 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xr2gt"]
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.556277 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerID="9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a" exitCode=0
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.556353 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln2lx"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.556374 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln2lx" event={"ID":"d5c0f5c8-cecf-451f-abef-bf357716eb71","Type":"ContainerDied","Data":"9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a"}
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.557673 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln2lx" event={"ID":"d5c0f5c8-cecf-451f-abef-bf357716eb71","Type":"ContainerDied","Data":"a9f4794a6036dec4476c4be7ee2587554c0cf25782f49b8a2635038cb9771dcf"}
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.557715 4751 scope.go:117] "RemoveContainer" containerID="9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.560649 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4m6r" event={"ID":"8d5f1383-42d7-47a1-9e47-8dba038241d2","Type":"ContainerStarted","Data":"eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8"}
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.576123 4751 scope.go:117] "RemoveContainer" containerID="ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c"
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.594659 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m4m6r" podStartSLOduration=2.199564629 podStartE2EDuration="1m4.594641732s" podCreationTimestamp="2026-01-31 14:44:06 +0000 UTC" firstStartedPulling="2026-01-31 14:44:07.792618566 +0000 UTC m=+152.167331451" lastFinishedPulling="2026-01-31 14:45:10.187695669 +0000 UTC m=+214.562408554" observedRunningTime="2026-01-31 14:45:10.583319895 +0000 UTC m=+214.958032780" watchObservedRunningTime="2026-01-31
14:45:10.594641732 +0000 UTC m=+214.969354617" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.598347 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ln2lx"] Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.606368 4751 scope.go:117] "RemoveContainer" containerID="4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.610127 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ln2lx"] Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.615776 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.634260 4751 scope.go:117] "RemoveContainer" containerID="9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a" Jan 31 14:45:10 crc kubenswrapper[4751]: E0131 14:45:10.637965 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a\": container with ID starting with 9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a not found: ID does not exist" containerID="9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.638005 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a"} err="failed to get container status \"9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a\": rpc error: code = NotFound desc = could not find container \"9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a\": container with ID starting with 9543af933d7edd0ace2113eee9860f1097d0ebff37536026931c946593943f0a not found: ID does not exist" 
Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.638030 4751 scope.go:117] "RemoveContainer" containerID="ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.641139 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gktqp" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="registry-server" probeResult="failure" output=< Jan 31 14:45:10 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Jan 31 14:45:10 crc kubenswrapper[4751]: > Jan 31 14:45:10 crc kubenswrapper[4751]: E0131 14:45:10.641169 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c\": container with ID starting with ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c not found: ID does not exist" containerID="ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.641227 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c"} err="failed to get container status \"ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c\": rpc error: code = NotFound desc = could not find container \"ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c\": container with ID starting with ec5579b0b5c03bbe363906a09e9b8073fa04eb6f15f0254accde5725abd7492c not found: ID does not exist" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.641256 4751 scope.go:117] "RemoveContainer" containerID="4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08" Jan 31 14:45:10 crc kubenswrapper[4751]: E0131 14:45:10.644217 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08\": container with ID starting with 4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08 not found: ID does not exist" containerID="4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08" Jan 31 14:45:10 crc kubenswrapper[4751]: I0131 14:45:10.644263 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08"} err="failed to get container status \"4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08\": rpc error: code = NotFound desc = could not find container \"4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08\": container with ID starting with 4b70b0f5c40fae7241cf1b33c7ddc52732dc42394eac071686d9ade2daf20d08 not found: ID does not exist" Jan 31 14:45:12 crc kubenswrapper[4751]: I0131 14:45:12.175202 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s7j7f"] Jan 31 14:45:12 crc kubenswrapper[4751]: I0131 14:45:12.413195 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" path="/var/lib/kubelet/pods/d5c0f5c8-cecf-451f-abef-bf357716eb71/volumes" Jan 31 14:45:12 crc kubenswrapper[4751]: I0131 14:45:12.572563 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s7j7f" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="registry-server" containerID="cri-o://060b3a3ddee4e9105f7ffb6a1ce801e4a26650b10b65707cadc77226dc18ea06" gracePeriod=2 Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.588932 4751 generic.go:334] "Generic (PLEG): container finished" podID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerID="060b3a3ddee4e9105f7ffb6a1ce801e4a26650b10b65707cadc77226dc18ea06" exitCode=0 Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 
14:45:13.589015 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerDied","Data":"060b3a3ddee4e9105f7ffb6a1ce801e4a26650b10b65707cadc77226dc18ea06"} Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.829803 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.980082 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-catalog-content\") pod \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.980147 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-utilities\") pod \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.980320 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hwzm\" (UniqueName: \"kubernetes.io/projected/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-kube-api-access-8hwzm\") pod \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\" (UID: \"f614f9ab-b5e2-4548-93e7-571d1ffb57b0\") " Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.981524 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-utilities" (OuterVolumeSpecName: "utilities") pod "f614f9ab-b5e2-4548-93e7-571d1ffb57b0" (UID: "f614f9ab-b5e2-4548-93e7-571d1ffb57b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:45:13 crc kubenswrapper[4751]: I0131 14:45:13.987817 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-kube-api-access-8hwzm" (OuterVolumeSpecName: "kube-api-access-8hwzm") pod "f614f9ab-b5e2-4548-93e7-571d1ffb57b0" (UID: "f614f9ab-b5e2-4548-93e7-571d1ffb57b0"). InnerVolumeSpecName "kube-api-access-8hwzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.081831 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hwzm\" (UniqueName: \"kubernetes.io/projected/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-kube-api-access-8hwzm\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.082111 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.091440 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f614f9ab-b5e2-4548-93e7-571d1ffb57b0" (UID: "f614f9ab-b5e2-4548-93e7-571d1ffb57b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.183649 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f614f9ab-b5e2-4548-93e7-571d1ffb57b0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.596823 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7j7f" event={"ID":"f614f9ab-b5e2-4548-93e7-571d1ffb57b0","Type":"ContainerDied","Data":"7b416007999209b30e30ac3cbb706b9a31917cc6ff3256ae9a397696b89670d4"} Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.596886 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7j7f" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.596908 4751 scope.go:117] "RemoveContainer" containerID="060b3a3ddee4e9105f7ffb6a1ce801e4a26650b10b65707cadc77226dc18ea06" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.612143 4751 scope.go:117] "RemoveContainer" containerID="f9bdddf94b5f6d16e3861f0fec527d5909cdc3d4c12d6d71c61a9d592a18874f" Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.616235 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s7j7f"] Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.621523 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s7j7f"] Jan 31 14:45:14 crc kubenswrapper[4751]: I0131 14:45:14.633545 4751 scope.go:117] "RemoveContainer" containerID="9dea3e4098c379086439a00ba95f58535865ef9c6e3300b004af608a3da30bb4" Jan 31 14:45:16 crc kubenswrapper[4751]: I0131 14:45:16.411960 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" path="/var/lib/kubelet/pods/f614f9ab-b5e2-4548-93e7-571d1ffb57b0/volumes" Jan 31 14:45:16 crc 
kubenswrapper[4751]: I0131 14:45:16.635713 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:45:16 crc kubenswrapper[4751]: I0131 14:45:16.635755 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:45:16 crc kubenswrapper[4751]: I0131 14:45:16.694605 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:45:17 crc kubenswrapper[4751]: I0131 14:45:17.053471 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:45:17 crc kubenswrapper[4751]: I0131 14:45:17.652926 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:45:17 crc kubenswrapper[4751]: I0131 14:45:17.974607 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2lq4t"] Jan 31 14:45:17 crc kubenswrapper[4751]: I0131 14:45:17.975153 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2lq4t" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="registry-server" containerID="cri-o://cd00a1023a77b74abf41fc259fe9b2a475f7ada25dc1b39aa83f13c57794ccfd" gracePeriod=2 Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.447097 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.447249 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.481878 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:45:18 crc kubenswrapper[4751]: E0131 14:45:18.588699 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc447796d_48ac_4eeb_8fe6_ad411966b3d3.slice/crio-conmon-cd00a1023a77b74abf41fc259fe9b2a475f7ada25dc1b39aa83f13c57794ccfd.scope\": RecentStats: unable to find data in memory cache]" Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.620713 4751 generic.go:334] "Generic (PLEG): container finished" podID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerID="cd00a1023a77b74abf41fc259fe9b2a475f7ada25dc1b39aa83f13c57794ccfd" exitCode=0 Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.620816 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerDied","Data":"cd00a1023a77b74abf41fc259fe9b2a475f7ada25dc1b39aa83f13c57794ccfd"} Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.665352 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.854624 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:45:18 crc kubenswrapper[4751]: I0131 14:45:18.887756 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.475612 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.552433 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-utilities\") pod \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.552570 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2g96\" (UniqueName: \"kubernetes.io/projected/c447796d-48ac-4eeb-8fe6-ad411966b3d3-kube-api-access-n2g96\") pod \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.552620 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-catalog-content\") pod \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\" (UID: \"c447796d-48ac-4eeb-8fe6-ad411966b3d3\") " Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.553403 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-utilities" (OuterVolumeSpecName: "utilities") pod "c447796d-48ac-4eeb-8fe6-ad411966b3d3" (UID: "c447796d-48ac-4eeb-8fe6-ad411966b3d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.558400 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c447796d-48ac-4eeb-8fe6-ad411966b3d3-kube-api-access-n2g96" (OuterVolumeSpecName: "kube-api-access-n2g96") pod "c447796d-48ac-4eeb-8fe6-ad411966b3d3" (UID: "c447796d-48ac-4eeb-8fe6-ad411966b3d3"). InnerVolumeSpecName "kube-api-access-n2g96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.594458 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c447796d-48ac-4eeb-8fe6-ad411966b3d3" (UID: "c447796d-48ac-4eeb-8fe6-ad411966b3d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.627710 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2lq4t" event={"ID":"c447796d-48ac-4eeb-8fe6-ad411966b3d3","Type":"ContainerDied","Data":"5c1f5c13def0721993c42fbb7e9330a705cffc8e6326a288871d364ef1275f63"} Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.627795 4751 scope.go:117] "RemoveContainer" containerID="cd00a1023a77b74abf41fc259fe9b2a475f7ada25dc1b39aa83f13c57794ccfd" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.627796 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2lq4t" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.647933 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.648643 4751 scope.go:117] "RemoveContainer" containerID="6ee8fb8b12b6ee8cd20623ca96e7a87fc43d879e6d76a01bc3e55e235825e807" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.655364 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.655404 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2g96\" (UniqueName: \"kubernetes.io/projected/c447796d-48ac-4eeb-8fe6-ad411966b3d3-kube-api-access-n2g96\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.655423 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c447796d-48ac-4eeb-8fe6-ad411966b3d3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.655451 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2lq4t"] Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.663158 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2lq4t"] Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.687907 4751 scope.go:117] "RemoveContainer" containerID="f55678880104a29f2f67c32892dfe2939404ec7dce246a6e2dd6c365f96de5ab" Jan 31 14:45:19 crc kubenswrapper[4751]: I0131 14:45:19.694387 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:45:20 crc 
kubenswrapper[4751]: I0131 14:45:20.416995 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" path="/var/lib/kubelet/pods/c447796d-48ac-4eeb-8fe6-ad411966b3d3/volumes" Jan 31 14:45:20 crc kubenswrapper[4751]: I0131 14:45:20.779759 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfjx5"] Jan 31 14:45:20 crc kubenswrapper[4751]: I0131 14:45:20.780168 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nfjx5" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="registry-server" containerID="cri-o://3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24" gracePeriod=2 Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.224594 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.278658 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-utilities\") pod \"e771b68a-beea-4c8b-a085-b869155ca20d\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.278712 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-catalog-content\") pod \"e771b68a-beea-4c8b-a085-b869155ca20d\" (UID: \"e771b68a-beea-4c8b-a085-b869155ca20d\") " Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.278761 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nf7x\" (UniqueName: \"kubernetes.io/projected/e771b68a-beea-4c8b-a085-b869155ca20d-kube-api-access-4nf7x\") pod \"e771b68a-beea-4c8b-a085-b869155ca20d\" (UID: 
\"e771b68a-beea-4c8b-a085-b869155ca20d\") " Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.279493 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-utilities" (OuterVolumeSpecName: "utilities") pod "e771b68a-beea-4c8b-a085-b869155ca20d" (UID: "e771b68a-beea-4c8b-a085-b869155ca20d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.282623 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e771b68a-beea-4c8b-a085-b869155ca20d-kube-api-access-4nf7x" (OuterVolumeSpecName: "kube-api-access-4nf7x") pod "e771b68a-beea-4c8b-a085-b869155ca20d" (UID: "e771b68a-beea-4c8b-a085-b869155ca20d"). InnerVolumeSpecName "kube-api-access-4nf7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.313209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e771b68a-beea-4c8b-a085-b869155ca20d" (UID: "e771b68a-beea-4c8b-a085-b869155ca20d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.379763 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nf7x\" (UniqueName: \"kubernetes.io/projected/e771b68a-beea-4c8b-a085-b869155ca20d-kube-api-access-4nf7x\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.379797 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.379806 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b68a-beea-4c8b-a085-b869155ca20d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.641817 4751 generic.go:334] "Generic (PLEG): container finished" podID="e771b68a-beea-4c8b-a085-b869155ca20d" containerID="3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24" exitCode=0 Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.641898 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfjx5" event={"ID":"e771b68a-beea-4c8b-a085-b869155ca20d","Type":"ContainerDied","Data":"3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24"} Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.641952 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfjx5" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.641990 4751 scope.go:117] "RemoveContainer" containerID="3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.641971 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfjx5" event={"ID":"e771b68a-beea-4c8b-a085-b869155ca20d","Type":"ContainerDied","Data":"17e2b2135e55e973ccc015ba33cfd9e0c7a1763d73b3153f649e1c6747bac744"} Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.660365 4751 scope.go:117] "RemoveContainer" containerID="449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.670328 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfjx5"] Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.673218 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfjx5"] Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.700844 4751 scope.go:117] "RemoveContainer" containerID="cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.716226 4751 scope.go:117] "RemoveContainer" containerID="3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24" Jan 31 14:45:21 crc kubenswrapper[4751]: E0131 14:45:21.716642 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24\": container with ID starting with 3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24 not found: ID does not exist" containerID="3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.716747 4751 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24"} err="failed to get container status \"3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24\": rpc error: code = NotFound desc = could not find container \"3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24\": container with ID starting with 3b08935560f35380fa53730c9bdd64653cc7a118a7c4b8bd45e5eeddbd415e24 not found: ID does not exist" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.716849 4751 scope.go:117] "RemoveContainer" containerID="449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74" Jan 31 14:45:21 crc kubenswrapper[4751]: E0131 14:45:21.717444 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74\": container with ID starting with 449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74 not found: ID does not exist" containerID="449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.717481 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74"} err="failed to get container status \"449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74\": rpc error: code = NotFound desc = could not find container \"449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74\": container with ID starting with 449419de3999925adbffe064ed4c6d253fd8062b2c5e50eef6d0e389bc5f1a74 not found: ID does not exist" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.717506 4751 scope.go:117] "RemoveContainer" containerID="cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455" Jan 31 14:45:21 crc kubenswrapper[4751]: E0131 
14:45:21.717790 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455\": container with ID starting with cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455 not found: ID does not exist" containerID="cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455" Jan 31 14:45:21 crc kubenswrapper[4751]: I0131 14:45:21.717891 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455"} err="failed to get container status \"cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455\": rpc error: code = NotFound desc = could not find container \"cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455\": container with ID starting with cc1400d076f7032bfa7b9349903c39c2a8d9d2e65e96a7551c8c78a1f7255455 not found: ID does not exist" Jan 31 14:45:22 crc kubenswrapper[4751]: I0131 14:45:22.413559 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" path="/var/lib/kubelet/pods/e771b68a-beea-4c8b-a085-b869155ca20d/volumes" Jan 31 14:45:23 crc kubenswrapper[4751]: I0131 14:45:23.886859 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc"] Jan 31 14:45:23 crc kubenswrapper[4751]: I0131 14:45:23.887128 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" podUID="7689427f-2c92-4b56-9617-1139504142ee" containerName="controller-manager" containerID="cri-o://5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55" gracePeriod=30 Jan 31 14:45:23 crc kubenswrapper[4751]: I0131 14:45:23.985995 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f"] Jan 31 14:45:23 crc kubenswrapper[4751]: I0131 14:45:23.986775 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" podUID="78cdf25f-daec-4cd7-8954-1fef6f3727db" containerName="route-controller-manager" containerID="cri-o://d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b" gracePeriod=30 Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.389568 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.482981 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.523592 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-client-ca\") pod \"78cdf25f-daec-4cd7-8954-1fef6f3727db\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.523646 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78cdf25f-daec-4cd7-8954-1fef6f3727db-serving-cert\") pod \"78cdf25f-daec-4cd7-8954-1fef6f3727db\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.523697 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tv6b\" (UniqueName: \"kubernetes.io/projected/78cdf25f-daec-4cd7-8954-1fef6f3727db-kube-api-access-6tv6b\") pod \"78cdf25f-daec-4cd7-8954-1fef6f3727db\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " 
Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.523735 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-config\") pod \"78cdf25f-daec-4cd7-8954-1fef6f3727db\" (UID: \"78cdf25f-daec-4cd7-8954-1fef6f3727db\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.524659 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-client-ca" (OuterVolumeSpecName: "client-ca") pod "78cdf25f-daec-4cd7-8954-1fef6f3727db" (UID: "78cdf25f-daec-4cd7-8954-1fef6f3727db"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.524692 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-config" (OuterVolumeSpecName: "config") pod "78cdf25f-daec-4cd7-8954-1fef6f3727db" (UID: "78cdf25f-daec-4cd7-8954-1fef6f3727db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.529060 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cdf25f-daec-4cd7-8954-1fef6f3727db-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78cdf25f-daec-4cd7-8954-1fef6f3727db" (UID: "78cdf25f-daec-4cd7-8954-1fef6f3727db"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.529308 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cdf25f-daec-4cd7-8954-1fef6f3727db-kube-api-access-6tv6b" (OuterVolumeSpecName: "kube-api-access-6tv6b") pod "78cdf25f-daec-4cd7-8954-1fef6f3727db" (UID: "78cdf25f-daec-4cd7-8954-1fef6f3727db"). 
InnerVolumeSpecName "kube-api-access-6tv6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.624760 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-client-ca\") pod \"7689427f-2c92-4b56-9617-1139504142ee\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.624822 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-config\") pod \"7689427f-2c92-4b56-9617-1139504142ee\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.624844 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7689427f-2c92-4b56-9617-1139504142ee-serving-cert\") pod \"7689427f-2c92-4b56-9617-1139504142ee\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.624913 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-proxy-ca-bundles\") pod \"7689427f-2c92-4b56-9617-1139504142ee\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.624933 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krs4f\" (UniqueName: \"kubernetes.io/projected/7689427f-2c92-4b56-9617-1139504142ee-kube-api-access-krs4f\") pod \"7689427f-2c92-4b56-9617-1139504142ee\" (UID: \"7689427f-2c92-4b56-9617-1139504142ee\") " Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.625141 4751 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6tv6b\" (UniqueName: \"kubernetes.io/projected/78cdf25f-daec-4cd7-8954-1fef6f3727db-kube-api-access-6tv6b\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.625152 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.625163 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78cdf25f-daec-4cd7-8954-1fef6f3727db-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.625172 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78cdf25f-daec-4cd7-8954-1fef6f3727db-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.626057 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "7689427f-2c92-4b56-9617-1139504142ee" (UID: "7689427f-2c92-4b56-9617-1139504142ee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.626083 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7689427f-2c92-4b56-9617-1139504142ee" (UID: "7689427f-2c92-4b56-9617-1139504142ee"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.626263 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-config" (OuterVolumeSpecName: "config") pod "7689427f-2c92-4b56-9617-1139504142ee" (UID: "7689427f-2c92-4b56-9617-1139504142ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.630216 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7689427f-2c92-4b56-9617-1139504142ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7689427f-2c92-4b56-9617-1139504142ee" (UID: "7689427f-2c92-4b56-9617-1139504142ee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.631271 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7689427f-2c92-4b56-9617-1139504142ee-kube-api-access-krs4f" (OuterVolumeSpecName: "kube-api-access-krs4f") pod "7689427f-2c92-4b56-9617-1139504142ee" (UID: "7689427f-2c92-4b56-9617-1139504142ee"). InnerVolumeSpecName "kube-api-access-krs4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.656534 4751 generic.go:334] "Generic (PLEG): container finished" podID="7689427f-2c92-4b56-9617-1139504142ee" containerID="5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55" exitCode=0 Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.656626 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.656603 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" event={"ID":"7689427f-2c92-4b56-9617-1139504142ee","Type":"ContainerDied","Data":"5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55"} Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.656753 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc" event={"ID":"7689427f-2c92-4b56-9617-1139504142ee","Type":"ContainerDied","Data":"53e8f013a379679cef6168fde6d18706b1593479aa88c48f6b833cf1be744d64"} Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.656778 4751 scope.go:117] "RemoveContainer" containerID="5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.657790 4751 generic.go:334] "Generic (PLEG): container finished" podID="78cdf25f-daec-4cd7-8954-1fef6f3727db" containerID="d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b" exitCode=0 Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.657813 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" event={"ID":"78cdf25f-daec-4cd7-8954-1fef6f3727db","Type":"ContainerDied","Data":"d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b"} Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.657826 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" event={"ID":"78cdf25f-daec-4cd7-8954-1fef6f3727db","Type":"ContainerDied","Data":"970e5859fee2ce045474246c69d85b582bd206939166d1d5d70f07913b640d31"} Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.657868 4751 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.675222 4751 scope.go:117] "RemoveContainer" containerID="5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55" Jan 31 14:45:24 crc kubenswrapper[4751]: E0131 14:45:24.675524 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55\": container with ID starting with 5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55 not found: ID does not exist" containerID="5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.675552 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55"} err="failed to get container status \"5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55\": rpc error: code = NotFound desc = could not find container \"5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55\": container with ID starting with 5ab0b26976de57e2a1cac017e1118160bad1efe8234f246fb4610cad0bf8fe55 not found: ID does not exist" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.675573 4751 scope.go:117] "RemoveContainer" containerID="d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.684088 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f"] Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.687522 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fb78c7854-7b78f"] Jan 31 14:45:24 crc kubenswrapper[4751]: 
I0131 14:45:24.688492 4751 scope.go:117] "RemoveContainer" containerID="d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b" Jan 31 14:45:24 crc kubenswrapper[4751]: E0131 14:45:24.688966 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b\": container with ID starting with d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b not found: ID does not exist" containerID="d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.689001 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b"} err="failed to get container status \"d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b\": rpc error: code = NotFound desc = could not find container \"d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b\": container with ID starting with d6fcf4545fc743b58a8d591ee8c96761c9cf12c0b3c6f1393a495a38267ab81b not found: ID does not exist" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.700346 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc"] Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.703212 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9ff5cfb77-zl2hc"] Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.726715 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.726755 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krs4f\" 
(UniqueName: \"kubernetes.io/projected/7689427f-2c92-4b56-9617-1139504142ee-kube-api-access-krs4f\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.726768 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.726777 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7689427f-2c92-4b56-9617-1139504142ee-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:24 crc kubenswrapper[4751]: I0131 14:45:24.726786 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7689427f-2c92-4b56-9617-1139504142ee-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136398 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"] Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136636 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136652 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136663 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cdf25f-daec-4cd7-8954-1fef6f3727db" containerName="route-controller-manager" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136672 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cdf25f-daec-4cd7-8954-1fef6f3727db" containerName="route-controller-manager" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136683 4751 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136691 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136704 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136711 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136724 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136732 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136756 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee27ad5-3acb-4388-a964-3b526b79e776" containerName="pruner" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136764 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee27ad5-3acb-4388-a964-3b526b79e776" containerName="pruner" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136775 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136782 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136791 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136799 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136807 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7534cc-afa8-4cd1-acb0-e4269e55316b" containerName="collect-profiles" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136815 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7534cc-afa8-4cd1-acb0-e4269e55316b" containerName="collect-profiles" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136823 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136830 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136844 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136852 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136866 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136874 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136882 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136889 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136901 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7689427f-2c92-4b56-9617-1139504142ee" containerName="controller-manager" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136908 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7689427f-2c92-4b56-9617-1139504142ee" containerName="controller-manager" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136936 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136945 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="extract-utilities" Jan 31 14:45:25 crc kubenswrapper[4751]: E0131 14:45:25.136953 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.136960 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="extract-content" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137093 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c0f5c8-cecf-451f-abef-bf357716eb71" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137111 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614f9ab-b5e2-4548-93e7-571d1ffb57b0" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137121 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7534cc-afa8-4cd1-acb0-e4269e55316b" containerName="collect-profiles" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137133 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e771b68a-beea-4c8b-a085-b869155ca20d" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137145 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c447796d-48ac-4eeb-8fe6-ad411966b3d3" containerName="registry-server" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137156 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee27ad5-3acb-4388-a964-3b526b79e776" containerName="pruner" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137165 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cdf25f-daec-4cd7-8954-1fef6f3727db" containerName="route-controller-manager" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137176 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7689427f-2c92-4b56-9617-1139504142ee" containerName="controller-manager" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.137592 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.140329 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"] Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.140377 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.140444 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.140538 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.140984 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.141003 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.141153 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.143874 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.144219 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.144596 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.145181 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.145359 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.146170 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.149213 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.151723 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.156162 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"]
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.163910 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"]
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.234703 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-config\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.235157 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-client-ca\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.235366 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-config\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.235559 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-proxy-ca-bundles\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.235737 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-client-ca\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.235914 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t85s\" (UniqueName: \"kubernetes.io/projected/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-kube-api-access-9t85s\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.236136 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51033f6-0061-4b08-9d82-11c610c7d396-serving-cert\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.236300 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-serving-cert\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.236472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqnt2\" (UniqueName: \"kubernetes.io/projected/e51033f6-0061-4b08-9d82-11c610c7d396-kube-api-access-mqnt2\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337430 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51033f6-0061-4b08-9d82-11c610c7d396-serving-cert\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337473 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-serving-cert\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337494 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqnt2\" (UniqueName: \"kubernetes.io/projected/e51033f6-0061-4b08-9d82-11c610c7d396-kube-api-access-mqnt2\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337528 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-client-ca\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337546 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-config\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337565 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-config\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-proxy-ca-bundles\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337618 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-client-ca\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.337635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t85s\" (UniqueName: \"kubernetes.io/projected/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-kube-api-access-9t85s\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.338996 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-proxy-ca-bundles\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.339118 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-client-ca\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.339405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-config\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.339658 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-config\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.339946 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-client-ca\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.342471 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51033f6-0061-4b08-9d82-11c610c7d396-serving-cert\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.351212 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-serving-cert\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.353557 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqnt2\" (UniqueName: \"kubernetes.io/projected/e51033f6-0061-4b08-9d82-11c610c7d396-kube-api-access-mqnt2\") pod \"route-controller-manager-77b9dd5444-kpj8z\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.364746 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t85s\" (UniqueName: \"kubernetes.io/projected/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-kube-api-access-9t85s\") pod \"controller-manager-7df5c4f8d-6z7qk\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.457673 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.470664 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.773100 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"]
Jan 31 14:45:25 crc kubenswrapper[4751]: W0131 14:45:25.773577 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51033f6_0061_4b08_9d82_11c610c7d396.slice/crio-701b25b8e3fbaec6025474cc0863bcdd565567a075d8cac932f6692b1bdc32fa WatchSource:0}: Error finding container 701b25b8e3fbaec6025474cc0863bcdd565567a075d8cac932f6692b1bdc32fa: Status 404 returned error can't find the container with id 701b25b8e3fbaec6025474cc0863bcdd565567a075d8cac932f6692b1bdc32fa
Jan 31 14:45:25 crc kubenswrapper[4751]: I0131 14:45:25.903976 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"]
Jan 31 14:45:25 crc kubenswrapper[4751]: W0131 14:45:25.915422 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e50c97_9956_48fc_a759_6d6a2e2d8ca5.slice/crio-f5eab53aa57319123515f3ce0a1dc6f4bc60f14152bae2520bba9ec245f1d592 WatchSource:0}: Error finding container f5eab53aa57319123515f3ce0a1dc6f4bc60f14152bae2520bba9ec245f1d592: Status 404 returned error can't find the container with id f5eab53aa57319123515f3ce0a1dc6f4bc60f14152bae2520bba9ec245f1d592
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.411508 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7689427f-2c92-4b56-9617-1139504142ee" path="/var/lib/kubelet/pods/7689427f-2c92-4b56-9617-1139504142ee/volumes"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.412544 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cdf25f-daec-4cd7-8954-1fef6f3727db" path="/var/lib/kubelet/pods/78cdf25f-daec-4cd7-8954-1fef6f3727db/volumes"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.672297 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" event={"ID":"e51033f6-0061-4b08-9d82-11c610c7d396","Type":"ContainerStarted","Data":"d3805c33079a37613edf0ea51929b4cc19479078ccdac739485e2af4ae10c78a"}
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.672554 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.672626 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" event={"ID":"e51033f6-0061-4b08-9d82-11c610c7d396","Type":"ContainerStarted","Data":"701b25b8e3fbaec6025474cc0863bcdd565567a075d8cac932f6692b1bdc32fa"}
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.674684 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" event={"ID":"10e50c97-9956-48fc-a759-6d6a2e2d8ca5","Type":"ContainerStarted","Data":"ddc997e5bd0f4e42afe9a829495321c7b00002150e75b55e2c4d433cd4092402"}
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.674724 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" event={"ID":"10e50c97-9956-48fc-a759-6d6a2e2d8ca5","Type":"ContainerStarted","Data":"f5eab53aa57319123515f3ce0a1dc6f4bc60f14152bae2520bba9ec245f1d592"}
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.675037 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.676619 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.678685 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.688210 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" podStartSLOduration=2.688193596 podStartE2EDuration="2.688193596s" podCreationTimestamp="2026-01-31 14:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:45:26.687272842 +0000 UTC m=+231.061985727" watchObservedRunningTime="2026-01-31 14:45:26.688193596 +0000 UTC m=+231.062906491"
Jan 31 14:45:26 crc kubenswrapper[4751]: I0131 14:45:26.728090 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" podStartSLOduration=3.728050414 podStartE2EDuration="3.728050414s" podCreationTimestamp="2026-01-31 14:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:45:26.726472592 +0000 UTC m=+231.101185487" watchObservedRunningTime="2026-01-31 14:45:26.728050414 +0000 UTC m=+231.102763309"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.259378 4751 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.260238 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff" gracePeriod=15
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.260299 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3" gracePeriod=15
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.260296 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19" gracePeriod=15
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.260294 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea" gracePeriod=15
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.260434 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218" gracePeriod=15
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.261528 4751 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.261838 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.261869 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.261897 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.261909 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.261927 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.261940 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.261958 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.261971 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.261987 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.261999 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.262013 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262025 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.262040 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262051 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262247 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262272 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262288 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262308 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262322 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.262341 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.264691 4751 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.265886 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.271728 4751 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336184 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336313 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336362 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336425 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336454 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336491 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336532 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.336588 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.338827 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.437884 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.437983 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.437985 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438044 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438058 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438104 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438359 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438570 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438652 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438663 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438490 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.438708 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.439363 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.439435 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.439551 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.439579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.640292 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:45:32 crc kubenswrapper[4751]: W0131 14:45:32.690008 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ded9b48a582c8efc38308d8ddc31d8083958671210498b68ef4025e597af42e4 WatchSource:0}: Error finding container ded9b48a582c8efc38308d8ddc31d8083958671210498b68ef4025e597af42e4: Status 404 returned error can't find the container with id ded9b48a582c8efc38308d8ddc31d8083958671210498b68ef4025e597af42e4
Jan 31 14:45:32 crc kubenswrapper[4751]: E0131 14:45:32.693331 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.98:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fd80de0800fbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 14:45:32.692557759 +0000 UTC m=+237.067270684,LastTimestamp:2026-01-31 14:45:32.692557759 +0000 UTC m=+237.067270684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.720707 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.722714 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.724170 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19" exitCode=0
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.724213 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3" exitCode=0
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.724228 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea" exitCode=0
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.724246 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218" exitCode=2
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.724328 4751 scope.go:117] "RemoveContainer" containerID="16f93de7a002a1e6a281d90042a003acf677ad136b29da511ead9acd31eb7ab2"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.726297 4751 generic.go:334] "Generic (PLEG): container finished" podID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" containerID="8b4d59b21d9818b51f757f56dda578d9b5e64551b0acae90d2098c728b3290ee" exitCode=0
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.726385 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ccfa0c88-7f51-4d85-8a49-e05865c6a06e","Type":"ContainerDied","Data":"8b4d59b21d9818b51f757f56dda578d9b5e64551b0acae90d2098c728b3290ee"}
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.727373 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused"
Jan 31 14:45:32 crc kubenswrapper[4751]: I0131 14:45:32.727793 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ded9b48a582c8efc38308d8ddc31d8083958671210498b68ef4025e597af42e4"}
Jan 31 14:45:33 crc kubenswrapper[4751]: I0131 14:45:33.737028 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"be1bca22b91e771b11166bb91585a254e34658c1ab13b852f1301a3b8029237f"}
Jan 31 14:45:33 crc kubenswrapper[4751]: I0131 14:45:33.738052 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e"
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:33 crc kubenswrapper[4751]: E0131 14:45:33.738137 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 14:45:33 crc kubenswrapper[4751]: I0131 14:45:33.742311 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.191834 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.193238 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.268677 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kubelet-dir\") pod \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.268862 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"ccfa0c88-7f51-4d85-8a49-e05865c6a06e" (UID: "ccfa0c88-7f51-4d85-8a49-e05865c6a06e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.268882 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-var-lock\") pod \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.268946 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-var-lock" (OuterVolumeSpecName: "var-lock") pod "ccfa0c88-7f51-4d85-8a49-e05865c6a06e" (UID: "ccfa0c88-7f51-4d85-8a49-e05865c6a06e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.269018 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kube-api-access\") pod \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\" (UID: \"ccfa0c88-7f51-4d85-8a49-e05865c6a06e\") " Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.269582 4751 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.269615 4751 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.275973 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ccfa0c88-7f51-4d85-8a49-e05865c6a06e" (UID: "ccfa0c88-7f51-4d85-8a49-e05865c6a06e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.392030 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccfa0c88-7f51-4d85-8a49-e05865c6a06e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.656916 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.658518 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.659444 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.660203 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.753669 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.756138 4751 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff" exitCode=0 Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.756270 4751 scope.go:117] "RemoveContainer" containerID="8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.756485 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.761178 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.762020 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ccfa0c88-7f51-4d85-8a49-e05865c6a06e","Type":"ContainerDied","Data":"ca8ebba9df4a8c9712a669a8d97759aea5c95bd694f2cead6b4521af30eb8469"} Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.762140 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca8ebba9df4a8c9712a669a8d97759aea5c95bd694f2cead6b4521af30eb8469" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.762305 4751 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.797912 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798050 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798042 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798142 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798227 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798340 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798686 4751 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798724 4751 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798780 4751 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.798927 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.799415 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.811343 4751 scope.go:117] "RemoveContainer" containerID="7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.830361 4751 scope.go:117] "RemoveContainer" containerID="08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea" Jan 31 14:45:34 
crc kubenswrapper[4751]: I0131 14:45:34.848744 4751 scope.go:117] "RemoveContainer" containerID="ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.869889 4751 scope.go:117] "RemoveContainer" containerID="39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.892220 4751 scope.go:117] "RemoveContainer" containerID="92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.930100 4751 scope.go:117] "RemoveContainer" containerID="8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.930969 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\": container with ID starting with 8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19 not found: ID does not exist" containerID="8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.931063 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19"} err="failed to get container status \"8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\": rpc error: code = NotFound desc = could not find container \"8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19\": container with ID starting with 8df6c6abfaa0e16e8c6de87da9d3584b57a9e83e9dee03f98bb9de38cf71cc19 not found: ID does not exist" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.931143 4751 scope.go:117] "RemoveContainer" containerID="7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.931726 
4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\": container with ID starting with 7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3 not found: ID does not exist" containerID="7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.931764 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3"} err="failed to get container status \"7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\": rpc error: code = NotFound desc = could not find container \"7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3\": container with ID starting with 7d34543b9a432a0dd029c18b90dd065749cf5a74b9221dcedf22a810afa749a3 not found: ID does not exist" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.931792 4751 scope.go:117] "RemoveContainer" containerID="08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.932258 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\": container with ID starting with 08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea not found: ID does not exist" containerID="08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.932291 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea"} err="failed to get container status \"08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\": rpc error: code = 
NotFound desc = could not find container \"08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea\": container with ID starting with 08603a998264e94ef7871338a47019047495685fb867c5fa4f290f8a990dc6ea not found: ID does not exist" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.932312 4751 scope.go:117] "RemoveContainer" containerID="ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.933054 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\": container with ID starting with ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218 not found: ID does not exist" containerID="ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.933130 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218"} err="failed to get container status \"ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\": rpc error: code = NotFound desc = could not find container \"ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218\": container with ID starting with ddd61fd469b2710e86edf5644cff4c4272ce73a59b6578c4cf941c5408e2d218 not found: ID does not exist" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.933161 4751 scope.go:117] "RemoveContainer" containerID="39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.933883 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\": container with ID starting with 
39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff not found: ID does not exist" containerID="39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.933905 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff"} err="failed to get container status \"39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\": rpc error: code = NotFound desc = could not find container \"39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff\": container with ID starting with 39e4fe03f50f95359d284f7d5aebb0df697fba39743cf83857c7f17893cb2bff not found: ID does not exist" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.933922 4751 scope.go:117] "RemoveContainer" containerID="92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254" Jan 31 14:45:34 crc kubenswrapper[4751]: E0131 14:45:34.934617 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\": container with ID starting with 92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254 not found: ID does not exist" containerID="92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254" Jan 31 14:45:34 crc kubenswrapper[4751]: I0131 14:45:34.934701 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254"} err="failed to get container status \"92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\": rpc error: code = NotFound desc = could not find container \"92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254\": container with ID starting with 92ff81193764a63b8b46092711612e005ad922936fea86d92b4ef07a16f2a254 not found: ID does not 
exist" Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.082846 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.083411 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.486743 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerName="oauth-openshift" containerID="cri-o://01af9b04a121e47de6d720ef96908370b377b2bf6ed16ab772bd8cea30c24502" gracePeriod=15 Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.772091 4751 generic.go:334] "Generic (PLEG): container finished" podID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerID="01af9b04a121e47de6d720ef96908370b377b2bf6ed16ab772bd8cea30c24502" exitCode=0 Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.772186 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" event={"ID":"802d5225-ef3f-485c-bb85-3c0f18e42952","Type":"ContainerDied","Data":"01af9b04a121e47de6d720ef96908370b377b2bf6ed16ab772bd8cea30c24502"} Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.979016 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.979782 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.980007 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:35 crc kubenswrapper[4751]: I0131 14:45:35.980232 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.121964 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-policies\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122152 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-session\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " 
Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122253 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-trusted-ca-bundle\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122321 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-serving-cert\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122370 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnk4h\" (UniqueName: \"kubernetes.io/projected/802d5225-ef3f-485c-bb85-3c0f18e42952-kube-api-access-rnk4h\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122422 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-login\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122475 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-cliconfig\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122583 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-ocp-branding-template\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122629 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-idp-0-file-data\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122677 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-dir\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122772 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-error\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122821 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-provider-selection\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122891 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-service-ca\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.122945 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-router-certs\") pod \"802d5225-ef3f-485c-bb85-3c0f18e42952\" (UID: \"802d5225-ef3f-485c-bb85-3c0f18e42952\") " Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.123636 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.123900 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.124140 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.124832 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.125528 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.130032 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.130243 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.130502 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.136937 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802d5225-ef3f-485c-bb85-3c0f18e42952-kube-api-access-rnk4h" (OuterVolumeSpecName: "kube-api-access-rnk4h") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "kube-api-access-rnk4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.137349 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.137784 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.138339 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.139258 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.139802 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "802d5225-ef3f-485c-bb85-3c0f18e42952" (UID: "802d5225-ef3f-485c-bb85-3c0f18e42952"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.224898 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.224964 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.224987 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225008 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnk4h\" (UniqueName: \"kubernetes.io/projected/802d5225-ef3f-485c-bb85-3c0f18e42952-kube-api-access-rnk4h\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225029 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225049 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225190 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225212 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225233 4751 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225252 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225274 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225294 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.225313 4751 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/802d5225-ef3f-485c-bb85-3c0f18e42952-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 
14:45:36.225333 4751 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/802d5225-ef3f-485c-bb85-3c0f18e42952-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.410514 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.411282 4751 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.411966 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.418589 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 31 14:45:36 crc kubenswrapper[4751]: E0131 14:45:36.494521 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.98:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fd80de0800fbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 14:45:32.692557759 +0000 UTC m=+237.067270684,LastTimestamp:2026-01-31 14:45:32.692557759 +0000 UTC m=+237.067270684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.793363 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.793372 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" event={"ID":"802d5225-ef3f-485c-bb85-3c0f18e42952","Type":"ContainerDied","Data":"5084329a5b9f4efb799b2485cd137ef3c2a4c4cd5ed6e746dd1d5ef125ea23bd"} Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.793692 4751 scope.go:117] "RemoveContainer" containerID="01af9b04a121e47de6d720ef96908370b377b2bf6ed16ab772bd8cea30c24502" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.797108 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.800014 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.800752 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:36 crc kubenswrapper[4751]: I0131 14:45:36.801294 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" 
pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.256353 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.256954 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.257334 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.257728 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.257988 4751 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:37 crc kubenswrapper[4751]: I0131 14:45:37.258015 4751 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.258255 4751 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="200ms" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.459226 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="400ms" Jan 31 14:45:37 crc kubenswrapper[4751]: E0131 14:45:37.860199 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="800ms" Jan 31 14:45:38 crc kubenswrapper[4751]: E0131 14:45:38.661798 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="1.6s" Jan 31 14:45:40 crc kubenswrapper[4751]: E0131 14:45:40.263374 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="3.2s" Jan 31 14:45:43 crc kubenswrapper[4751]: E0131 14:45:43.465147 4751 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.98:6443: connect: connection refused" interval="6.4s" Jan 31 14:45:44 crc kubenswrapper[4751]: E0131 14:45:44.484177 4751 
desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.98:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" volumeName="registry-storage" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.405794 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.412428 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.413103 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.413940 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.414667 4751 status_manager.go:851] "Failed to get status for pod" 
podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.425615 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.425649 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45" Jan 31 14:45:46 crc kubenswrapper[4751]: E0131 14:45:46.426123 4751 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.426754 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:45:46 crc kubenswrapper[4751]: W0131 14:45:46.451307 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-404b70ebaf5fed25a5454946db611d40733044239df31a2f8f6dda576f28c2c0 WatchSource:0}: Error finding container 404b70ebaf5fed25a5454946db611d40733044239df31a2f8f6dda576f28c2c0: Status 404 returned error can't find the container with id 404b70ebaf5fed25a5454946db611d40733044239df31a2f8f6dda576f28c2c0 Jan 31 14:45:46 crc kubenswrapper[4751]: E0131 14:45:46.495299 4751 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.98:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fd80de0800fbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 14:45:32.692557759 +0000 UTC m=+237.067270684,LastTimestamp:2026-01-31 14:45:32.692557759 +0000 UTC m=+237.067270684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.886119 4751 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" 
containerID="742a7adfa45d74153bac627996d99b69392b41fb162b6188adf8e23c123aad69" exitCode=0 Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.886226 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"742a7adfa45d74153bac627996d99b69392b41fb162b6188adf8e23c123aad69"} Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.886624 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"404b70ebaf5fed25a5454946db611d40733044239df31a2f8f6dda576f28c2c0"} Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.887126 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.887165 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.887604 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: E0131 14:45:46.887790 4751 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.888164 4751 status_manager.go:851] "Failed to get 
status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.891690 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.891759 4751 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853" exitCode=1 Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.891797 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853"} Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.892486 4751 scope.go:117] "RemoveContainer" containerID="b37355e2b9d95a941be3552b713f7fe96973a49528f7a979524779b263412853" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.892774 4751 status_manager.go:851] "Failed to get status for pod" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" pod="openshift-authentication/oauth-openshift-558db77b4-xr2gt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xr2gt\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.893380 4751 status_manager.go:851] "Failed to get status for pod" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:46 crc kubenswrapper[4751]: I0131 14:45:46.893990 4751 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.98:6443: connect: connection refused" Jan 31 14:45:47 crc kubenswrapper[4751]: I0131 14:45:47.903620 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"64a8038adbf46c1a0accb259e789e83fc67321dda6a8a3aa486ff19d330d054d"} Jan 31 14:45:47 crc kubenswrapper[4751]: I0131 14:45:47.903669 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4fc4b7e8589c2df8783c4da81ac9dbb4fdd25aa1d0b5de7ee5f479299d107d91"} Jan 31 14:45:47 crc kubenswrapper[4751]: I0131 14:45:47.903683 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39599b3b855998562e1f7c861dd5691648eedf04ff2a6db2d75224a54c464df7"} Jan 31 14:45:47 crc kubenswrapper[4751]: I0131 14:45:47.907323 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 14:45:47 crc kubenswrapper[4751]: I0131 14:45:47.907372 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f457cbdc5e9ea9752a74adf3088e3884f6d9789fd54e67d4c3ec0ff19f6d5401"}
Jan 31 14:45:48 crc kubenswrapper[4751]: I0131 14:45:48.695603 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:45:48 crc kubenswrapper[4751]: I0131 14:45:48.926826 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e06d2b77d840d75967ee6907153f96eebedeff3900fb9de287bb4c4ff7b817b"}
Jan 31 14:45:48 crc kubenswrapper[4751]: I0131 14:45:48.926870 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"307f74fe108cfba0f6cd3ab198e901fea1902ccc42924a9b7a66076b3b0e53a2"}
Jan 31 14:45:48 crc kubenswrapper[4751]: I0131 14:45:48.927239 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:48 crc kubenswrapper[4751]: I0131 14:45:48.927254 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:49 crc kubenswrapper[4751]: I0131 14:45:49.777593 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:45:49 crc kubenswrapper[4751]: I0131 14:45:49.789670 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:45:51 crc kubenswrapper[4751]: I0131 14:45:51.427660 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:51 crc kubenswrapper[4751]: I0131 14:45:51.428025 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:51 crc kubenswrapper[4751]: I0131 14:45:51.435720 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:54 crc kubenswrapper[4751]: I0131 14:45:54.049199 4751 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:55 crc kubenswrapper[4751]: I0131 14:45:55.001799 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:55 crc kubenswrapper[4751]: I0131 14:45:55.001894 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:55 crc kubenswrapper[4751]: I0131 14:45:55.002049 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:55 crc kubenswrapper[4751]: I0131 14:45:55.007029 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:45:56 crc kubenswrapper[4751]: I0131 14:45:56.008691 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:56 crc kubenswrapper[4751]: I0131 14:45:56.008819 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:56 crc kubenswrapper[4751]: I0131 14:45:56.418374 4751 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e25c6070-8daf-4743-8483-2439c48514be"
Jan 31 14:45:57 crc kubenswrapper[4751]: I0131 14:45:57.016250 4751 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:57 crc kubenswrapper[4751]: I0131 14:45:57.016296 4751 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6afb84a0-c564-45aa-b7a1-cd6f8273fe45"
Jan 31 14:45:57 crc kubenswrapper[4751]: I0131 14:45:57.019746 4751 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e25c6070-8daf-4743-8483-2439c48514be"
Jan 31 14:45:58 crc kubenswrapper[4751]: I0131 14:45:58.700104 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.671939 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.796945 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.823745 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.881478 4751 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.889547 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-xr2gt"]
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.889654 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.896296 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 14:46:03 crc kubenswrapper[4751]: I0131 14:46:03.925566 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.925540265 podStartE2EDuration="9.925540265s" podCreationTimestamp="2026-01-31 14:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:46:03.916006294 +0000 UTC m=+268.290719219" watchObservedRunningTime="2026-01-31 14:46:03.925540265 +0000 UTC m=+268.300253190"
Jan 31 14:46:04 crc kubenswrapper[4751]: I0131 14:46:04.165931 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 31 14:46:04 crc kubenswrapper[4751]: I0131 14:46:04.417231 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" path="/var/lib/kubelet/pods/802d5225-ef3f-485c-bb85-3c0f18e42952/volumes"
Jan 31 14:46:04 crc kubenswrapper[4751]: I0131 14:46:04.630696 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 31 14:46:04 crc kubenswrapper[4751]: I0131 14:46:04.974119 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.192421 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.234227 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.482102 4751 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.482321 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://be1bca22b91e771b11166bb91585a254e34658c1ab13b852f1301a3b8029237f" gracePeriod=5
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.615248 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.735553 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.820268 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 31 14:46:05 crc kubenswrapper[4751]: I0131 14:46:05.860730 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.075008 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.088943 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.125298 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.205565 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.245464 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.392702 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.675219 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.946628 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 31 14:46:06 crc kubenswrapper[4751]: I0131 14:46:06.968627 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.050695 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.065173 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.091271 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.196130 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.208262 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.399706 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.414400 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.430305 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.549561 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.557843 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.716993 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.805203 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.929753 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.975783 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 31 14:46:07 crc kubenswrapper[4751]: I0131 14:46:07.995480 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.202158 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.216863 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.270764 4751 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.389193 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.409634 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.530630 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.533322 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.616786 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.710427 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.813652 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.842061 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.914880 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.965130 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 31 14:46:08 crc kubenswrapper[4751]: I0131 14:46:08.969311 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.039185 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.183104 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.374183 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.426933 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.429735 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.461810 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.516294 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.524012 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.630578 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.859163 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.877060 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 31 14:46:09 crc kubenswrapper[4751]: I0131 14:46:09.883717 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.016519 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.176313 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.201590 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.259745 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.321139 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.326514 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.343823 4751 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.346564 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.380713 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.482389 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.534519 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.546267 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.622818 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.624824 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.640675 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.679542 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.847560 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.935953 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.955921 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.961317 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 31 14:46:10 crc kubenswrapper[4751]: I0131 14:46:10.996371 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.032755 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.106989 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.107032 4751 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="be1bca22b91e771b11166bb91585a254e34658c1ab13b852f1301a3b8029237f" exitCode=137
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.132229 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.179638 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.189955 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.213366 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.213438 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.219569 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.225847 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.235567 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.338728 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.338825 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.338893 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.338919 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.338973 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339030 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339121 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339171 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339253 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339410 4751 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339442 4751 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339467 4751 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.339490 4751 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.342506 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.355909 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.441113 4751 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.467582 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.502664 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.614862 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.623848 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.725248 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.847175 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.878233 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.923600 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.953527 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 31 14:46:11 crc kubenswrapper[4751]: I0131 14:46:11.970969 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.047255 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.077238 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.115554 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.115626 4751 scope.go:117] "RemoveContainer" containerID="be1bca22b91e771b11166bb91585a254e34658c1ab13b852f1301a3b8029237f"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.115710 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.120175 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.253918 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.276994 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.282145 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.374381 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.399879 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.415332 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.416396 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.416933 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.528876 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.589030 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.629869 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.753793 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.802340 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.832026 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.838234 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 31 14:46:12 crc kubenswrapper[4751]: I0131 14:46:12.847639 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.121416 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.171595 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.332036 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.456275 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.470198 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.501428 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.503144 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.570644 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.678756 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.787257 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.877211 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.947593 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 31 14:46:13 crc kubenswrapper[4751]: I0131 14:46:13.992372 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.033894 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 31 14:46:14 crc kubenswrapper[4751]: I0131
14:46:14.116229 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.122893 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.127814 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.182528 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.217206 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.563325 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.615013 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.730901 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.766194 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.797606 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.930251 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.970674 
4751 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 14:46:14 crc kubenswrapper[4751]: I0131 14:46:14.995244 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.096943 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.097016 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.141147 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.169226 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.225130 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.320115 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.601766 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.611456 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.614039 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.643539 4751 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.646628 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.719006 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.720531 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.751696 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.754620 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.756391 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.770274 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.851846 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 14:46:15 crc kubenswrapper[4751]: I0131 14:46:15.997635 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.007472 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.038332 4751 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.090654 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.095676 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.146749 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.168385 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.202218 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.206995 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.217670 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.220981 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.231880 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.249741 4751 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.309175 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.436670 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.521712 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.581209 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.584997 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.615561 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.674716 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.696591 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.900521 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 14:46:16 crc kubenswrapper[4751]: I0131 14:46:16.916472 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 14:46:16 crc kubenswrapper[4751]: 
I0131 14:46:16.984663 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.042179 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.190221 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.199141 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.212455 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.241880 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.509874 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.545098 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.603571 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.654244 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.664647 4751 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.677745 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.685361 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.689215 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.790867 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.796491 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.797411 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.804622 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 14:46:17 crc kubenswrapper[4751]: I0131 14:46:17.884734 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.000363 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.053607 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.070426 4751 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.121469 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.257112 4751 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.260683 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.396027 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.420925 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.465150 4751 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.470410 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.549187 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.567342 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.636813 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" 
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.646900 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.691899 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.773216 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.813050 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.837932 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.849687 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 31 14:46:18 crc kubenswrapper[4751]: I0131 14:46:18.852000 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 31 14:46:19 crc kubenswrapper[4751]: I0131 14:46:19.085698 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 31 14:46:19 crc kubenswrapper[4751]: I0131 14:46:19.132946 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 31 14:46:19 crc kubenswrapper[4751]: I0131 14:46:19.333936 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 31 14:46:19 crc kubenswrapper[4751]: I0131 14:46:19.724171 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 31 14:46:19 crc kubenswrapper[4751]: I0131 14:46:19.772811 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 31 14:46:19 crc kubenswrapper[4751]: I0131 14:46:19.933249 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.041970 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f699678c6-qbqdh"]
Jan 31 14:46:20 crc kubenswrapper[4751]: E0131 14:46:20.042385 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042397 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 31 14:46:20 crc kubenswrapper[4751]: E0131 14:46:20.042413 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" containerName="installer"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042419 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" containerName="installer"
Jan 31 14:46:20 crc kubenswrapper[4751]: E0131 14:46:20.042428 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerName="oauth-openshift"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042434 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerName="oauth-openshift"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042512 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccfa0c88-7f51-4d85-8a49-e05865c6a06e" containerName="installer"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042527 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="802d5225-ef3f-485c-bb85-3c0f18e42952" containerName="oauth-openshift"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042536 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.042876 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.049234 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.049507 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.049721 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.050024 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.050115 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.050260 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.050511 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.050688 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.051027 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.051142 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.051244 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.054345 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.060703 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f699678c6-qbqdh"]
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063581 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-audit-policies\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063647 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqpbg\" (UniqueName: \"kubernetes.io/projected/c36bd7cf-5b67-414c-87f5-96de17336696-kube-api-access-cqpbg\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063694 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-router-certs\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063751 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-login\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063798 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063831 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-session\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063872 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c36bd7cf-5b67-414c-87f5-96de17336696-audit-dir\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063909 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063950 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-service-ca\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.063985 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-error\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.064017 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.064122 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.064165 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.064198 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.064763 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.070249 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.072566 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.076932 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.165903 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.165973 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166036 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-audit-policies\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166099 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqpbg\" (UniqueName: \"kubernetes.io/projected/c36bd7cf-5b67-414c-87f5-96de17336696-kube-api-access-cqpbg\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166140 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-router-certs\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166198 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-login\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166248 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166282 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-session\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166322 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c36bd7cf-5b67-414c-87f5-96de17336696-audit-dir\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166359 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166399 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-service-ca\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166437 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-error\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh"
Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166476 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") "
pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166524 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.166563 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c36bd7cf-5b67-414c-87f5-96de17336696-audit-dir\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.167454 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.167754 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.168300 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-service-ca\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.168676 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c36bd7cf-5b67-414c-87f5-96de17336696-audit-policies\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.172511 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.172680 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-router-certs\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.173499 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-session\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc 
kubenswrapper[4751]: I0131 14:46:20.175338 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.176136 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.176930 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-error\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.179875 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-user-template-login\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.182922 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c36bd7cf-5b67-414c-87f5-96de17336696-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.186872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqpbg\" (UniqueName: \"kubernetes.io/projected/c36bd7cf-5b67-414c-87f5-96de17336696-kube-api-access-cqpbg\") pod \"oauth-openshift-f699678c6-qbqdh\" (UID: \"c36bd7cf-5b67-414c-87f5-96de17336696\") " pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.265426 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.363094 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.370442 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:20 crc kubenswrapper[4751]: I0131 14:46:20.806045 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f699678c6-qbqdh"] Jan 31 14:46:21 crc kubenswrapper[4751]: I0131 14:46:21.169866 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" event={"ID":"c36bd7cf-5b67-414c-87f5-96de17336696","Type":"ContainerStarted","Data":"2b5cff92e9c639be33386db99fdb0476e4a0f37da7395b60fcf06f1a4046a4e5"} Jan 31 14:46:21 crc kubenswrapper[4751]: I0131 14:46:21.169904 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" event={"ID":"c36bd7cf-5b67-414c-87f5-96de17336696","Type":"ContainerStarted","Data":"b5ca8568e3e0799b2108dad67eb1e21f53d70396acd240a9bf5acb4e22d83be5"} Jan 31 14:46:21 crc kubenswrapper[4751]: I0131 14:46:21.170132 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:21 crc kubenswrapper[4751]: I0131 14:46:21.196978 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" podStartSLOduration=71.196959662 podStartE2EDuration="1m11.196959662s" podCreationTimestamp="2026-01-31 14:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:46:21.19650433 +0000 UTC m=+285.571217215" watchObservedRunningTime="2026-01-31 14:46:21.196959662 +0000 UTC m=+285.571672557" Jan 31 14:46:21 crc kubenswrapper[4751]: I0131 14:46:21.410497 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f699678c6-qbqdh" Jan 31 14:46:23 crc kubenswrapper[4751]: I0131 14:46:23.912897 4751 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"] Jan 31 14:46:23 crc kubenswrapper[4751]: I0131 14:46:23.913409 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" podUID="10e50c97-9956-48fc-a759-6d6a2e2d8ca5" containerName="controller-manager" containerID="cri-o://ddc997e5bd0f4e42afe9a829495321c7b00002150e75b55e2c4d433cd4092402" gracePeriod=30 Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.006736 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"] Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.006971 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" podUID="e51033f6-0061-4b08-9d82-11c610c7d396" containerName="route-controller-manager" containerID="cri-o://d3805c33079a37613edf0ea51929b4cc19479078ccdac739485e2af4ae10c78a" gracePeriod=30 Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.193296 4751 generic.go:334] "Generic (PLEG): container finished" podID="10e50c97-9956-48fc-a759-6d6a2e2d8ca5" containerID="ddc997e5bd0f4e42afe9a829495321c7b00002150e75b55e2c4d433cd4092402" exitCode=0 Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.193401 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" event={"ID":"10e50c97-9956-48fc-a759-6d6a2e2d8ca5","Type":"ContainerDied","Data":"ddc997e5bd0f4e42afe9a829495321c7b00002150e75b55e2c4d433cd4092402"} Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.195802 4751 generic.go:334] "Generic (PLEG): container finished" podID="e51033f6-0061-4b08-9d82-11c610c7d396" containerID="d3805c33079a37613edf0ea51929b4cc19479078ccdac739485e2af4ae10c78a" exitCode=0 Jan 31 14:46:24 crc 
kubenswrapper[4751]: I0131 14:46:24.195856 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" event={"ID":"e51033f6-0061-4b08-9d82-11c610c7d396","Type":"ContainerDied","Data":"d3805c33079a37613edf0ea51929b4cc19479078ccdac739485e2af4ae10c78a"} Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.461163 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.521344 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51033f6-0061-4b08-9d82-11c610c7d396-serving-cert\") pod \"e51033f6-0061-4b08-9d82-11c610c7d396\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.521463 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-config\") pod \"e51033f6-0061-4b08-9d82-11c610c7d396\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.522204 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqnt2\" (UniqueName: \"kubernetes.io/projected/e51033f6-0061-4b08-9d82-11c610c7d396-kube-api-access-mqnt2\") pod \"e51033f6-0061-4b08-9d82-11c610c7d396\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.522250 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-client-ca\") pod \"e51033f6-0061-4b08-9d82-11c610c7d396\" (UID: \"e51033f6-0061-4b08-9d82-11c610c7d396\") " Jan 31 14:46:24 crc kubenswrapper[4751]: 
I0131 14:46:24.522336 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-config" (OuterVolumeSpecName: "config") pod "e51033f6-0061-4b08-9d82-11c610c7d396" (UID: "e51033f6-0061-4b08-9d82-11c610c7d396"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.522746 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-client-ca" (OuterVolumeSpecName: "client-ca") pod "e51033f6-0061-4b08-9d82-11c610c7d396" (UID: "e51033f6-0061-4b08-9d82-11c610c7d396"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.528269 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51033f6-0061-4b08-9d82-11c610c7d396-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e51033f6-0061-4b08-9d82-11c610c7d396" (UID: "e51033f6-0061-4b08-9d82-11c610c7d396"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.530909 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51033f6-0061-4b08-9d82-11c610c7d396-kube-api-access-mqnt2" (OuterVolumeSpecName: "kube-api-access-mqnt2") pod "e51033f6-0061-4b08-9d82-11c610c7d396" (UID: "e51033f6-0061-4b08-9d82-11c610c7d396"). InnerVolumeSpecName "kube-api-access-mqnt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.624031 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e51033f6-0061-4b08-9d82-11c610c7d396-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.624082 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.624094 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqnt2\" (UniqueName: \"kubernetes.io/projected/e51033f6-0061-4b08-9d82-11c610c7d396-kube-api-access-mqnt2\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.624107 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e51033f6-0061-4b08-9d82-11c610c7d396-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:24 crc kubenswrapper[4751]: I0131 14:46:24.928010 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.028597 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-proxy-ca-bundles\") pod \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.030027 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-client-ca\") pod \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.030140 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t85s\" (UniqueName: \"kubernetes.io/projected/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-kube-api-access-9t85s\") pod \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.030198 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-config\") pod \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.030241 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-serving-cert\") pod \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\" (UID: \"10e50c97-9956-48fc-a759-6d6a2e2d8ca5\") " Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.029592 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "10e50c97-9956-48fc-a759-6d6a2e2d8ca5" (UID: "10e50c97-9956-48fc-a759-6d6a2e2d8ca5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.030917 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-client-ca" (OuterVolumeSpecName: "client-ca") pod "10e50c97-9956-48fc-a759-6d6a2e2d8ca5" (UID: "10e50c97-9956-48fc-a759-6d6a2e2d8ca5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.031049 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-config" (OuterVolumeSpecName: "config") pod "10e50c97-9956-48fc-a759-6d6a2e2d8ca5" (UID: "10e50c97-9956-48fc-a759-6d6a2e2d8ca5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.035443 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-kube-api-access-9t85s" (OuterVolumeSpecName: "kube-api-access-9t85s") pod "10e50c97-9956-48fc-a759-6d6a2e2d8ca5" (UID: "10e50c97-9956-48fc-a759-6d6a2e2d8ca5"). InnerVolumeSpecName "kube-api-access-9t85s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.037725 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10e50c97-9956-48fc-a759-6d6a2e2d8ca5" (UID: "10e50c97-9956-48fc-a759-6d6a2e2d8ca5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.132470 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.132505 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.132517 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t85s\" (UniqueName: \"kubernetes.io/projected/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-kube-api-access-9t85s\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.132531 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.132540 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10e50c97-9956-48fc-a759-6d6a2e2d8ca5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.177585 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-kkkr7"] Jan 31 14:46:25 crc kubenswrapper[4751]: E0131 14:46:25.178050 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51033f6-0061-4b08-9d82-11c610c7d396" containerName="route-controller-manager" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.178113 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51033f6-0061-4b08-9d82-11c610c7d396" containerName="route-controller-manager" Jan 31 14:46:25 crc 
kubenswrapper[4751]: E0131 14:46:25.178155 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e50c97-9956-48fc-a759-6d6a2e2d8ca5" containerName="controller-manager" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.178174 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e50c97-9956-48fc-a759-6d6a2e2d8ca5" containerName="controller-manager" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.178359 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51033f6-0061-4b08-9d82-11c610c7d396" containerName="route-controller-manager" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.178392 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e50c97-9956-48fc-a759-6d6a2e2d8ca5" containerName="controller-manager" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.179252 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.184119 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"] Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.185298 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.189713 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"] Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.196590 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-kkkr7"] Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.210950 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" event={"ID":"10e50c97-9956-48fc-a759-6d6a2e2d8ca5","Type":"ContainerDied","Data":"f5eab53aa57319123515f3ce0a1dc6f4bc60f14152bae2520bba9ec245f1d592"} Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.211001 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.211038 4751 scope.go:117] "RemoveContainer" containerID="ddc997e5bd0f4e42afe9a829495321c7b00002150e75b55e2c4d433cd4092402" Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.215335 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z" event={"ID":"e51033f6-0061-4b08-9d82-11c610c7d396","Type":"ContainerDied","Data":"701b25b8e3fbaec6025474cc0863bcdd565567a075d8cac932f6692b1bdc32fa"} Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.215402 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.232374 4751 scope.go:117] "RemoveContainer" containerID="d3805c33079a37613edf0ea51929b4cc19479078ccdac739485e2af4ae10c78a"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233291 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-proxy-ca-bundles\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233342 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-config\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233383 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqtdt\" (UniqueName: \"kubernetes.io/projected/63f706f0-4681-4255-9479-fa83f336faf3-kube-api-access-fqtdt\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233414 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-client-ca\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233541 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-client-ca\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-598sf\" (UniqueName: \"kubernetes.io/projected/4a69f6b0-3803-4184-bfd0-0fac841243c9-kube-api-access-598sf\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233597 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-config\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233613 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a69f6b0-3803-4184-bfd0-0fac841243c9-serving-cert\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.233648 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f706f0-4681-4255-9479-fa83f336faf3-serving-cert\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.253993 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"]
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.258455 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7df5c4f8d-6z7qk"]
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.276767 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"]
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.280742 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b9dd5444-kpj8z"]
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.334984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f706f0-4681-4255-9479-fa83f336faf3-serving-cert\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335052 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-proxy-ca-bundles\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335098 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-config\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335143 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtdt\" (UniqueName: \"kubernetes.io/projected/63f706f0-4681-4255-9479-fa83f336faf3-kube-api-access-fqtdt\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335183 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-client-ca\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335206 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-client-ca\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335227 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-598sf\" (UniqueName: \"kubernetes.io/projected/4a69f6b0-3803-4184-bfd0-0fac841243c9-kube-api-access-598sf\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335262 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-config\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.335281 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a69f6b0-3803-4184-bfd0-0fac841243c9-serving-cert\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.336350 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-proxy-ca-bundles\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.336811 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-client-ca\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.337034 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-client-ca\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.338529 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a69f6b0-3803-4184-bfd0-0fac841243c9-serving-cert\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.338918 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f706f0-4681-4255-9479-fa83f336faf3-serving-cert\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.339239 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-config\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.350684 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-config\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.354752 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-598sf\" (UniqueName: \"kubernetes.io/projected/4a69f6b0-3803-4184-bfd0-0fac841243c9-kube-api-access-598sf\") pod \"controller-manager-69f964bddc-kkkr7\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") " pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.354949 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtdt\" (UniqueName: \"kubernetes.io/projected/63f706f0-4681-4255-9479-fa83f336faf3-kube-api-access-fqtdt\") pod \"route-controller-manager-786458fd97-jr5xx\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.505752 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:25 crc kubenswrapper[4751]: I0131 14:46:25.528444 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:26 crc kubenswrapper[4751]: I0131 14:46:26.254671 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-kkkr7"]
Jan 31 14:46:26 crc kubenswrapper[4751]: I0131 14:46:26.265768 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"]
Jan 31 14:46:26 crc kubenswrapper[4751]: W0131 14:46:26.282015 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f706f0_4681_4255_9479_fa83f336faf3.slice/crio-f1b03d6b6b0f49ab50bcdb3528cc23a3e1574218292e25557a87fa9bff9d14bc WatchSource:0}: Error finding container f1b03d6b6b0f49ab50bcdb3528cc23a3e1574218292e25557a87fa9bff9d14bc: Status 404 returned error can't find the container with id f1b03d6b6b0f49ab50bcdb3528cc23a3e1574218292e25557a87fa9bff9d14bc
Jan 31 14:46:26 crc kubenswrapper[4751]: I0131 14:46:26.429671 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e50c97-9956-48fc-a759-6d6a2e2d8ca5" path="/var/lib/kubelet/pods/10e50c97-9956-48fc-a759-6d6a2e2d8ca5/volumes"
Jan 31 14:46:26 crc kubenswrapper[4751]: I0131 14:46:26.432116 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51033f6-0061-4b08-9d82-11c610c7d396" path="/var/lib/kubelet/pods/e51033f6-0061-4b08-9d82-11c610c7d396/volumes"
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.234663 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" event={"ID":"4a69f6b0-3803-4184-bfd0-0fac841243c9","Type":"ContainerStarted","Data":"fe82122738b743a68b1378a4b01c84a2b7160746fd4157fd800d3764207bde57"}
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.234707 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" event={"ID":"4a69f6b0-3803-4184-bfd0-0fac841243c9","Type":"ContainerStarted","Data":"86d14448a4ea80ec6af8481d7e6007a568a5184b3838fc04a9ee1ffb1652ee65"}
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.236106 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" event={"ID":"63f706f0-4681-4255-9479-fa83f336faf3","Type":"ContainerStarted","Data":"2f228eb3d0a527e09dcf6548c6df7eca5ca8b08cfdbf6688de33c64dc5398abe"}
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.236139 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" event={"ID":"63f706f0-4681-4255-9479-fa83f336faf3","Type":"ContainerStarted","Data":"f1b03d6b6b0f49ab50bcdb3528cc23a3e1574218292e25557a87fa9bff9d14bc"}
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.236493 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.259065 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" podStartSLOduration=4.25904459 podStartE2EDuration="4.25904459s" podCreationTimestamp="2026-01-31 14:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:46:27.257290664 +0000 UTC m=+291.632003559" watchObservedRunningTime="2026-01-31 14:46:27.25904459 +0000 UTC m=+291.633757475"
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.287893 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" podStartSLOduration=3.287873358 podStartE2EDuration="3.287873358s" podCreationTimestamp="2026-01-31 14:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:46:27.287555909 +0000 UTC m=+291.662268794" watchObservedRunningTime="2026-01-31 14:46:27.287873358 +0000 UTC m=+291.662586243"
Jan 31 14:46:27 crc kubenswrapper[4751]: I0131 14:46:27.708874 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:28 crc kubenswrapper[4751]: I0131 14:46:28.242009 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:28 crc kubenswrapper[4751]: I0131 14:46:28.248313 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:36 crc kubenswrapper[4751]: I0131 14:46:36.180523 4751 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Jan 31 14:46:43 crc kubenswrapper[4751]: I0131 14:46:43.911846 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-kkkr7"]
Jan 31 14:46:43 crc kubenswrapper[4751]: I0131 14:46:43.912799 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" podUID="4a69f6b0-3803-4184-bfd0-0fac841243c9" containerName="controller-manager" containerID="cri-o://fe82122738b743a68b1378a4b01c84a2b7160746fd4157fd800d3764207bde57" gracePeriod=30
Jan 31 14:46:43 crc kubenswrapper[4751]: I0131 14:46:43.939851 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"]
Jan 31 14:46:43 crc kubenswrapper[4751]: I0131 14:46:43.940548 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" podUID="63f706f0-4681-4255-9479-fa83f336faf3" containerName="route-controller-manager" containerID="cri-o://2f228eb3d0a527e09dcf6548c6df7eca5ca8b08cfdbf6688de33c64dc5398abe" gracePeriod=30
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.351709 4751 generic.go:334] "Generic (PLEG): container finished" podID="4a69f6b0-3803-4184-bfd0-0fac841243c9" containerID="fe82122738b743a68b1378a4b01c84a2b7160746fd4157fd800d3764207bde57" exitCode=0
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.351819 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" event={"ID":"4a69f6b0-3803-4184-bfd0-0fac841243c9","Type":"ContainerDied","Data":"fe82122738b743a68b1378a4b01c84a2b7160746fd4157fd800d3764207bde57"}
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.354212 4751 generic.go:334] "Generic (PLEG): container finished" podID="63f706f0-4681-4255-9479-fa83f336faf3" containerID="2f228eb3d0a527e09dcf6548c6df7eca5ca8b08cfdbf6688de33c64dc5398abe" exitCode=0
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.354260 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" event={"ID":"63f706f0-4681-4255-9479-fa83f336faf3","Type":"ContainerDied","Data":"2f228eb3d0a527e09dcf6548c6df7eca5ca8b08cfdbf6688de33c64dc5398abe"}
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.540968 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.547172 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696414 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-config\") pod \"4a69f6b0-3803-4184-bfd0-0fac841243c9\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696523 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f706f0-4681-4255-9479-fa83f336faf3-serving-cert\") pod \"63f706f0-4681-4255-9479-fa83f336faf3\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696629 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-config\") pod \"63f706f0-4681-4255-9479-fa83f336faf3\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696677 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-proxy-ca-bundles\") pod \"4a69f6b0-3803-4184-bfd0-0fac841243c9\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696716 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-client-ca\") pod \"63f706f0-4681-4255-9479-fa83f336faf3\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696752 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-client-ca\") pod \"4a69f6b0-3803-4184-bfd0-0fac841243c9\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696790 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqtdt\" (UniqueName: \"kubernetes.io/projected/63f706f0-4681-4255-9479-fa83f336faf3-kube-api-access-fqtdt\") pod \"63f706f0-4681-4255-9479-fa83f336faf3\" (UID: \"63f706f0-4681-4255-9479-fa83f336faf3\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696829 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a69f6b0-3803-4184-bfd0-0fac841243c9-serving-cert\") pod \"4a69f6b0-3803-4184-bfd0-0fac841243c9\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.696857 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-598sf\" (UniqueName: \"kubernetes.io/projected/4a69f6b0-3803-4184-bfd0-0fac841243c9-kube-api-access-598sf\") pod \"4a69f6b0-3803-4184-bfd0-0fac841243c9\" (UID: \"4a69f6b0-3803-4184-bfd0-0fac841243c9\") "
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.697630 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a69f6b0-3803-4184-bfd0-0fac841243c9" (UID: "4a69f6b0-3803-4184-bfd0-0fac841243c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.698046 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4a69f6b0-3803-4184-bfd0-0fac841243c9" (UID: "4a69f6b0-3803-4184-bfd0-0fac841243c9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.698141 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-config" (OuterVolumeSpecName: "config") pod "4a69f6b0-3803-4184-bfd0-0fac841243c9" (UID: "4a69f6b0-3803-4184-bfd0-0fac841243c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.698267 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.698378 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.698562 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-client-ca" (OuterVolumeSpecName: "client-ca") pod "63f706f0-4681-4255-9479-fa83f336faf3" (UID: "63f706f0-4681-4255-9479-fa83f336faf3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.698637 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-config" (OuterVolumeSpecName: "config") pod "63f706f0-4681-4255-9479-fa83f336faf3" (UID: "63f706f0-4681-4255-9479-fa83f336faf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.703163 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a69f6b0-3803-4184-bfd0-0fac841243c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a69f6b0-3803-4184-bfd0-0fac841243c9" (UID: "4a69f6b0-3803-4184-bfd0-0fac841243c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.703270 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a69f6b0-3803-4184-bfd0-0fac841243c9-kube-api-access-598sf" (OuterVolumeSpecName: "kube-api-access-598sf") pod "4a69f6b0-3803-4184-bfd0-0fac841243c9" (UID: "4a69f6b0-3803-4184-bfd0-0fac841243c9"). InnerVolumeSpecName "kube-api-access-598sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.703908 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f706f0-4681-4255-9479-fa83f336faf3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63f706f0-4681-4255-9479-fa83f336faf3" (UID: "63f706f0-4681-4255-9479-fa83f336faf3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.704351 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f706f0-4681-4255-9479-fa83f336faf3-kube-api-access-fqtdt" (OuterVolumeSpecName: "kube-api-access-fqtdt") pod "63f706f0-4681-4255-9479-fa83f336faf3" (UID: "63f706f0-4681-4255-9479-fa83f336faf3"). InnerVolumeSpecName "kube-api-access-fqtdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799796 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799853 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqtdt\" (UniqueName: \"kubernetes.io/projected/63f706f0-4681-4255-9479-fa83f336faf3-kube-api-access-fqtdt\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799878 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a69f6b0-3803-4184-bfd0-0fac841243c9-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799901 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-598sf\" (UniqueName: \"kubernetes.io/projected/4a69f6b0-3803-4184-bfd0-0fac841243c9-kube-api-access-598sf\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799949 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a69f6b0-3803-4184-bfd0-0fac841243c9-config\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799974 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63f706f0-4681-4255-9479-fa83f336faf3-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:44 crc kubenswrapper[4751]: I0131 14:46:44.799992 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f706f0-4681-4255-9479-fa83f336faf3-config\") on node \"crc\" DevicePath \"\""
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.190722 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"]
Jan 31 14:46:45 crc kubenswrapper[4751]: E0131 14:46:45.191287 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a69f6b0-3803-4184-bfd0-0fac841243c9" containerName="controller-manager"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.191318 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a69f6b0-3803-4184-bfd0-0fac841243c9" containerName="controller-manager"
Jan 31 14:46:45 crc kubenswrapper[4751]: E0131 14:46:45.191351 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f706f0-4681-4255-9479-fa83f336faf3" containerName="route-controller-manager"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.191364 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f706f0-4681-4255-9479-fa83f336faf3" containerName="route-controller-manager"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.191533 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a69f6b0-3803-4184-bfd0-0fac841243c9" containerName="controller-manager"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.191570 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f706f0-4681-4255-9479-fa83f336faf3" containerName="route-controller-manager"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.192216 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.213156 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77bc486b6-z2pvg"]
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.215733 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.232329 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77bc486b6-z2pvg"]
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.247242 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"]
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308201 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-config\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308266 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-client-ca\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308457 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-proxy-ca-bundles\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308528 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61fbf77a-1344-4a32-81b4-9a12283ace53-serving-cert\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308545 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqp6\" (UniqueName: \"kubernetes.io/projected/61fbf77a-1344-4a32-81b4-9a12283ace53-kube-api-access-pjqp6\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308686 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/886a47b7-6715-4cd7-aea5-7db85b593b9b-serving-cert\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308768 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sbxv\" (UniqueName: \"kubernetes.io/projected/886a47b7-6715-4cd7-aea5-7db85b593b9b-kube-api-access-6sbxv\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308796 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-config\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.308839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-client-ca\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.361853 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx" event={"ID":"63f706f0-4681-4255-9479-fa83f336faf3","Type":"ContainerDied","Data":"f1b03d6b6b0f49ab50bcdb3528cc23a3e1574218292e25557a87fa9bff9d14bc"}
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.361938 4751 scope.go:117] "RemoveContainer" containerID="2f228eb3d0a527e09dcf6548c6df7eca5ca8b08cfdbf6688de33c64dc5398abe"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.362051 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.367555 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7" event={"ID":"4a69f6b0-3803-4184-bfd0-0fac841243c9","Type":"ContainerDied","Data":"86d14448a4ea80ec6af8481d7e6007a568a5184b3838fc04a9ee1ffb1652ee65"}
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.367696 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f964bddc-kkkr7"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.382704 4751 scope.go:117] "RemoveContainer" containerID="fe82122738b743a68b1378a4b01c84a2b7160746fd4157fd800d3764207bde57"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.400909 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"]
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.406274 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-jr5xx"]
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.409768 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-proxy-ca-bundles\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.409837 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61fbf77a-1344-4a32-81b4-9a12283ace53-serving-cert\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: 
\"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.409874 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqp6\" (UniqueName: \"kubernetes.io/projected/61fbf77a-1344-4a32-81b4-9a12283ace53-kube-api-access-pjqp6\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.409927 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/886a47b7-6715-4cd7-aea5-7db85b593b9b-serving-cert\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.409972 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sbxv\" (UniqueName: \"kubernetes.io/projected/886a47b7-6715-4cd7-aea5-7db85b593b9b-kube-api-access-6sbxv\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.410005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-config\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.410048 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-client-ca\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.410128 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-config\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.410158 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-client-ca\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.411567 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-client-ca\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.413267 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-proxy-ca-bundles\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 
14:46:45.413779 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-client-ca\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.413882 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-config\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.416033 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-config\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.417760 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61fbf77a-1344-4a32-81b4-9a12283ace53-serving-cert\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.421401 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-kkkr7"] Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.430052 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-kkkr7"] Jan 31 14:46:45 crc 
kubenswrapper[4751]: I0131 14:46:45.444004 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/886a47b7-6715-4cd7-aea5-7db85b593b9b-serving-cert\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.445022 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sbxv\" (UniqueName: \"kubernetes.io/projected/886a47b7-6715-4cd7-aea5-7db85b593b9b-kube-api-access-6sbxv\") pod \"controller-manager-77bc486b6-z2pvg\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") " pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.447496 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqp6\" (UniqueName: \"kubernetes.io/projected/61fbf77a-1344-4a32-81b4-9a12283ace53-kube-api-access-pjqp6\") pod \"route-controller-manager-7ff7586b44-p7vhp\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") " pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.529817 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.542448 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:45 crc kubenswrapper[4751]: I0131 14:46:45.882172 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"] Jan 31 14:46:45 crc kubenswrapper[4751]: W0131 14:46:45.887257 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61fbf77a_1344_4a32_81b4_9a12283ace53.slice/crio-ce28e111d2169a56020e396a7157657e7dded02c68bf85f9bd547a411d88e0d1 WatchSource:0}: Error finding container ce28e111d2169a56020e396a7157657e7dded02c68bf85f9bd547a411d88e0d1: Status 404 returned error can't find the container with id ce28e111d2169a56020e396a7157657e7dded02c68bf85f9bd547a411d88e0d1 Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.035839 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77bc486b6-z2pvg"] Jan 31 14:46:46 crc kubenswrapper[4751]: W0131 14:46:46.043395 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886a47b7_6715_4cd7_aea5_7db85b593b9b.slice/crio-1f155d2931dd972625ba384f618c3c395946a7bcfd8383aa5e80dfb2a1ba9412 WatchSource:0}: Error finding container 1f155d2931dd972625ba384f618c3c395946a7bcfd8383aa5e80dfb2a1ba9412: Status 404 returned error can't find the container with id 1f155d2931dd972625ba384f618c3c395946a7bcfd8383aa5e80dfb2a1ba9412 Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.376161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" event={"ID":"886a47b7-6715-4cd7-aea5-7db85b593b9b","Type":"ContainerStarted","Data":"4e67ee071bf6d27d2aacad78b8e2a9b1cad4b202c044eb94395ef3b85a36b3e5"} Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.376989 4751 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.377273 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" event={"ID":"886a47b7-6715-4cd7-aea5-7db85b593b9b","Type":"ContainerStarted","Data":"1f155d2931dd972625ba384f618c3c395946a7bcfd8383aa5e80dfb2a1ba9412"} Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.379380 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" event={"ID":"61fbf77a-1344-4a32-81b4-9a12283ace53","Type":"ContainerStarted","Data":"0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc"} Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.379436 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" event={"ID":"61fbf77a-1344-4a32-81b4-9a12283ace53","Type":"ContainerStarted","Data":"ce28e111d2169a56020e396a7157657e7dded02c68bf85f9bd547a411d88e0d1"} Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.379793 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.384666 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.419783 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a69f6b0-3803-4184-bfd0-0fac841243c9" path="/var/lib/kubelet/pods/4a69f6b0-3803-4184-bfd0-0fac841243c9/volumes" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.420619 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="63f706f0-4681-4255-9479-fa83f336faf3" path="/var/lib/kubelet/pods/63f706f0-4681-4255-9479-fa83f336faf3/volumes" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.434555 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" podStartSLOduration=3.434536821 podStartE2EDuration="3.434536821s" podCreationTimestamp="2026-01-31 14:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:46:46.404405659 +0000 UTC m=+310.779118544" watchObservedRunningTime="2026-01-31 14:46:46.434536821 +0000 UTC m=+310.809249716" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.457535 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" podStartSLOduration=3.457516245 podStartE2EDuration="3.457516245s" podCreationTimestamp="2026-01-31 14:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:46:46.453792577 +0000 UTC m=+310.828505462" watchObservedRunningTime="2026-01-31 14:46:46.457516245 +0000 UTC m=+310.832229130" Jan 31 14:46:46 crc kubenswrapper[4751]: I0131 14:46:46.501826 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:46:48 crc kubenswrapper[4751]: I0131 14:46:48.815655 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.813646 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5fdjn"] Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.818507 4751 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.821278 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5fdjn"] Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.926471 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bc486b6-z2pvg"] Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.926674 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" podUID="886a47b7-6715-4cd7-aea5-7db85b593b9b" containerName="controller-manager" containerID="cri-o://4e67ee071bf6d27d2aacad78b8e2a9b1cad4b202c044eb94395ef3b85a36b3e5" gracePeriod=30 Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.941025 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"] Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.941264 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" podUID="61fbf77a-1344-4a32-81b4-9a12283ace53" containerName="route-controller-manager" containerID="cri-o://0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc" gracePeriod=30 Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.959888 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-trusted-ca\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.959933 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.959955 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8m8k\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-kube-api-access-z8m8k\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.959986 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.960083 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-registry-certificates\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.960135 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-registry-tls\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: 
\"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.960159 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-bound-sa-token\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.960225 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:03 crc kubenswrapper[4751]: I0131 14:47:03.982065 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.065920 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8m8k\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-kube-api-access-z8m8k\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.065992 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-registry-certificates\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.066013 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-registry-tls\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.066044 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-bound-sa-token\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.066123 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.066187 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-trusted-ca\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.066208 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.067676 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-registry-certificates\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.071136 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-trusted-ca\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.073869 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.074573 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-registry-tls\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc 
kubenswrapper[4751]: I0131 14:47:04.077190 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.088343 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8m8k\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-kube-api-access-z8m8k\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.090471 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e5d9ae7-378c-4f07-9d25-d1b3d187bde9-bound-sa-token\") pod \"image-registry-66df7c8f76-5fdjn\" (UID: \"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9\") " pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.141001 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.459266 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.488171 4751 generic.go:334] "Generic (PLEG): container finished" podID="886a47b7-6715-4cd7-aea5-7db85b593b9b" containerID="4e67ee071bf6d27d2aacad78b8e2a9b1cad4b202c044eb94395ef3b85a36b3e5" exitCode=0 Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.488269 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" event={"ID":"886a47b7-6715-4cd7-aea5-7db85b593b9b","Type":"ContainerDied","Data":"4e67ee071bf6d27d2aacad78b8e2a9b1cad4b202c044eb94395ef3b85a36b3e5"} Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.492040 4751 generic.go:334] "Generic (PLEG): container finished" podID="61fbf77a-1344-4a32-81b4-9a12283ace53" containerID="0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc" exitCode=0 Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.492105 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.492108 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" event={"ID":"61fbf77a-1344-4a32-81b4-9a12283ace53","Type":"ContainerDied","Data":"0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc"}
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.492157 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp" event={"ID":"61fbf77a-1344-4a32-81b4-9a12283ace53","Type":"ContainerDied","Data":"ce28e111d2169a56020e396a7157657e7dded02c68bf85f9bd547a411d88e0d1"}
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.492177 4751 scope.go:117] "RemoveContainer" containerID="0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc"
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.524565 4751 scope.go:117] "RemoveContainer" containerID="0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc"
Jan 31 14:47:04 crc kubenswrapper[4751]: E0131 14:47:04.524953 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc\": container with ID starting with 0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc not found: ID does not exist" containerID="0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc"
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.525002 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc"} err="failed to get container status \"0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc\": rpc error: code = NotFound desc = could not find container \"0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc\": container with ID starting with 0ddb810f9243354e528fff64b7098fda6c73e32066f11dc3cd5f6a8f0284bcdc not found: ID does not exist"
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.573350 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-client-ca\") pod \"61fbf77a-1344-4a32-81b4-9a12283ace53\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.573523 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61fbf77a-1344-4a32-81b4-9a12283ace53-serving-cert\") pod \"61fbf77a-1344-4a32-81b4-9a12283ace53\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.573583 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjqp6\" (UniqueName: \"kubernetes.io/projected/61fbf77a-1344-4a32-81b4-9a12283ace53-kube-api-access-pjqp6\") pod \"61fbf77a-1344-4a32-81b4-9a12283ace53\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.573695 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-config\") pod \"61fbf77a-1344-4a32-81b4-9a12283ace53\" (UID: \"61fbf77a-1344-4a32-81b4-9a12283ace53\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.574185 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-client-ca" (OuterVolumeSpecName: "client-ca") pod "61fbf77a-1344-4a32-81b4-9a12283ace53" (UID: "61fbf77a-1344-4a32-81b4-9a12283ace53"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.574815 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-config" (OuterVolumeSpecName: "config") pod "61fbf77a-1344-4a32-81b4-9a12283ace53" (UID: "61fbf77a-1344-4a32-81b4-9a12283ace53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.578746 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61fbf77a-1344-4a32-81b4-9a12283ace53-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "61fbf77a-1344-4a32-81b4-9a12283ace53" (UID: "61fbf77a-1344-4a32-81b4-9a12283ace53"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.579470 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61fbf77a-1344-4a32-81b4-9a12283ace53-kube-api-access-pjqp6" (OuterVolumeSpecName: "kube-api-access-pjqp6") pod "61fbf77a-1344-4a32-81b4-9a12283ace53" (UID: "61fbf77a-1344-4a32-81b4-9a12283ace53"). InnerVolumeSpecName "kube-api-access-pjqp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.592548 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.617091 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5fdjn"]
Jan 31 14:47:04 crc kubenswrapper[4751]: W0131 14:47:04.632298 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e5d9ae7_378c_4f07_9d25_d1b3d187bde9.slice/crio-aa513204eefb20e434ea51bebfe6e79a0009c66d211ca5b79c4bd783529f6928 WatchSource:0}: Error finding container aa513204eefb20e434ea51bebfe6e79a0009c66d211ca5b79c4bd783529f6928: Status 404 returned error can't find the container with id aa513204eefb20e434ea51bebfe6e79a0009c66d211ca5b79c4bd783529f6928
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.675777 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-client-ca\") pod \"886a47b7-6715-4cd7-aea5-7db85b593b9b\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676210 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-proxy-ca-bundles\") pod \"886a47b7-6715-4cd7-aea5-7db85b593b9b\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676294 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/886a47b7-6715-4cd7-aea5-7db85b593b9b-serving-cert\") pod \"886a47b7-6715-4cd7-aea5-7db85b593b9b\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676331 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-config\") pod \"886a47b7-6715-4cd7-aea5-7db85b593b9b\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676347 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sbxv\" (UniqueName: \"kubernetes.io/projected/886a47b7-6715-4cd7-aea5-7db85b593b9b-kube-api-access-6sbxv\") pod \"886a47b7-6715-4cd7-aea5-7db85b593b9b\" (UID: \"886a47b7-6715-4cd7-aea5-7db85b593b9b\") "
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676555 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676567 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61fbf77a-1344-4a32-81b4-9a12283ace53-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676575 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjqp6\" (UniqueName: \"kubernetes.io/projected/61fbf77a-1344-4a32-81b4-9a12283ace53-kube-api-access-pjqp6\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.676585 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61fbf77a-1344-4a32-81b4-9a12283ace53-config\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.677330 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-client-ca" (OuterVolumeSpecName: "client-ca") pod "886a47b7-6715-4cd7-aea5-7db85b593b9b" (UID: "886a47b7-6715-4cd7-aea5-7db85b593b9b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.677384 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-config" (OuterVolumeSpecName: "config") pod "886a47b7-6715-4cd7-aea5-7db85b593b9b" (UID: "886a47b7-6715-4cd7-aea5-7db85b593b9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.678226 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "886a47b7-6715-4cd7-aea5-7db85b593b9b" (UID: "886a47b7-6715-4cd7-aea5-7db85b593b9b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.680186 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886a47b7-6715-4cd7-aea5-7db85b593b9b-kube-api-access-6sbxv" (OuterVolumeSpecName: "kube-api-access-6sbxv") pod "886a47b7-6715-4cd7-aea5-7db85b593b9b" (UID: "886a47b7-6715-4cd7-aea5-7db85b593b9b"). InnerVolumeSpecName "kube-api-access-6sbxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.689510 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886a47b7-6715-4cd7-aea5-7db85b593b9b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "886a47b7-6715-4cd7-aea5-7db85b593b9b" (UID: "886a47b7-6715-4cd7-aea5-7db85b593b9b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.777189 4751 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.777242 4751 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.777252 4751 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/886a47b7-6715-4cd7-aea5-7db85b593b9b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.777261 4751 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886a47b7-6715-4cd7-aea5-7db85b593b9b-config\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.777270 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sbxv\" (UniqueName: \"kubernetes.io/projected/886a47b7-6715-4cd7-aea5-7db85b593b9b-kube-api-access-6sbxv\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.825459 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"]
Jan 31 14:47:04 crc kubenswrapper[4751]: I0131 14:47:04.829308 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ff7586b44-p7vhp"]
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.202924 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-w29sc"]
Jan 31 14:47:05 crc kubenswrapper[4751]: E0131 14:47:05.203305 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886a47b7-6715-4cd7-aea5-7db85b593b9b" containerName="controller-manager"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.203350 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="886a47b7-6715-4cd7-aea5-7db85b593b9b" containerName="controller-manager"
Jan 31 14:47:05 crc kubenswrapper[4751]: E0131 14:47:05.203379 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fbf77a-1344-4a32-81b4-9a12283ace53" containerName="route-controller-manager"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.203393 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fbf77a-1344-4a32-81b4-9a12283ace53" containerName="route-controller-manager"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.203553 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="61fbf77a-1344-4a32-81b4-9a12283ace53" containerName="route-controller-manager"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.203581 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="886a47b7-6715-4cd7-aea5-7db85b593b9b" containerName="controller-manager"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.204172 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.212016 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"]
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.212995 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.217703 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.218178 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.218397 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.218606 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.218816 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.219252 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.233951 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-w29sc"]
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.243770 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"]
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283572 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-client-ca\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-config\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283713 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-serving-cert\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283757 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-client-ca\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283810 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szbhb\" (UniqueName: \"kubernetes.io/projected/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-kube-api-access-szbhb\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283855 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-config\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283886 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-serving-cert\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283922 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-proxy-ca-bundles\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.283971 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnljq\" (UniqueName: \"kubernetes.io/projected/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-kube-api-access-pnljq\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384782 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szbhb\" (UniqueName: \"kubernetes.io/projected/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-kube-api-access-szbhb\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384836 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-config\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384869 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-serving-cert\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384889 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-proxy-ca-bundles\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384917 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnljq\" (UniqueName: \"kubernetes.io/projected/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-kube-api-access-pnljq\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384940 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-client-ca\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384960 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-config\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.384984 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-serving-cert\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.385008 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-client-ca\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.386984 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-client-ca\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.387170 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-config\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.387592 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-client-ca\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.388085 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-proxy-ca-bundles\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.388372 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-config\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.392786 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-serving-cert\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.392951 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-serving-cert\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.407486 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnljq\" (UniqueName: \"kubernetes.io/projected/20655fbe-0d1c-451d-8ab2-1b8e3423fbcd-kube-api-access-pnljq\") pod \"route-controller-manager-786458fd97-vccrh\" (UID: \"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd\") " pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.410091 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szbhb\" (UniqueName: \"kubernetes.io/projected/d8efbc4e-7e88-4914-8141-8b93ace1dcb0-kube-api-access-szbhb\") pod \"controller-manager-69f964bddc-w29sc\" (UID: \"d8efbc4e-7e88-4914-8141-8b93ace1dcb0\") " pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.501161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg" event={"ID":"886a47b7-6715-4cd7-aea5-7db85b593b9b","Type":"ContainerDied","Data":"1f155d2931dd972625ba384f618c3c395946a7bcfd8383aa5e80dfb2a1ba9412"}
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.501191 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bc486b6-z2pvg"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.501211 4751 scope.go:117] "RemoveContainer" containerID="4e67ee071bf6d27d2aacad78b8e2a9b1cad4b202c044eb94395ef3b85a36b3e5"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.505167 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" event={"ID":"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9","Type":"ContainerStarted","Data":"2a23d20731bcc1295cc79827deb93850e328eee568b010969dd169117fad03dd"}
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.505507 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" event={"ID":"9e5d9ae7-378c-4f07-9d25-d1b3d187bde9","Type":"ContainerStarted","Data":"aa513204eefb20e434ea51bebfe6e79a0009c66d211ca5b79c4bd783529f6928"}
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.505547 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.526523 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" podStartSLOduration=2.52649724 podStartE2EDuration="2.52649724s" podCreationTimestamp="2026-01-31 14:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:47:05.523149747 +0000 UTC m=+329.897862662" watchObservedRunningTime="2026-01-31 14:47:05.52649724 +0000 UTC m=+329.901210165"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.537480 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.545474 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.569806 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bc486b6-z2pvg"]
Jan 31 14:47:05 crc kubenswrapper[4751]: I0131 14:47:05.573388 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77bc486b6-z2pvg"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.023127 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh"]
Jan 31 14:47:06 crc kubenswrapper[4751]: W0131 14:47:06.031675 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20655fbe_0d1c_451d_8ab2_1b8e3423fbcd.slice/crio-42e78426b9ee75e568c2050d6a332e4005abb4b37e408c567550901cb461109e WatchSource:0}: Error finding container 42e78426b9ee75e568c2050d6a332e4005abb4b37e408c567550901cb461109e: Status 404 returned error can't find the container with id 42e78426b9ee75e568c2050d6a332e4005abb4b37e408c567550901cb461109e
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.085647 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69f964bddc-w29sc"]
Jan 31 14:47:06 crc kubenswrapper[4751]: W0131 14:47:06.094346 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8efbc4e_7e88_4914_8141_8b93ace1dcb0.slice/crio-c34e6be9317685ffc3d68cf782fa0094bfb98f5274e80b311fa44888e6c0023b WatchSource:0}: Error finding container c34e6be9317685ffc3d68cf782fa0094bfb98f5274e80b311fa44888e6c0023b: Status 404 returned error can't find the container with id c34e6be9317685ffc3d68cf782fa0094bfb98f5274e80b311fa44888e6c0023b
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.336199 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m4m6r"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.336910 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m4m6r" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="registry-server" containerID="cri-o://eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8" gracePeriod=30
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.352689 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcnsn"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.352916 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wcnsn" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="registry-server" containerID="cri-o://362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b" gracePeriod=30
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.362951 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5r6kv"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.363216 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator" containerID="cri-o://cc163d448fa8fad6b5ab0077c0960c4003a53c503f6d097090f206fed6245a22" gracePeriod=30
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.371923 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xfl"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.372251 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k2xfl" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="registry-server" containerID="cri-o://3bb7101aeb47dd5d5b9aa6ef1075e32a424c360c1ebaa7fd0787c20e4303f647" gracePeriod=30
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.386167 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jv94g"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.387092 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.393822 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gktqp"]
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.394153 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gktqp" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="registry-server" containerID="cri-o://632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea" gracePeriod=30
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.422410 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61fbf77a-1344-4a32-81b4-9a12283ace53" path="/var/lib/kubelet/pods/61fbf77a-1344-4a32-81b4-9a12283ace53/volumes"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.423454 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886a47b7-6715-4cd7-aea5-7db85b593b9b" path="/var/lib/kubelet/pods/886a47b7-6715-4cd7-aea5-7db85b593b9b/volumes"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.424529 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jv94g"]
Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.441543 4751 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b is running failed: container process not found" containerID="362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b" cmd=["grpc_health_probe","-addr=:50051"]
Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.442168 4751 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b is running failed: container process not found" containerID="362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b" cmd=["grpc_health_probe","-addr=:50051"]
Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.442549 4751 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b is running failed: container process not found" containerID="362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b" cmd=["grpc_health_probe","-addr=:50051"]
Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.442583 4751 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-wcnsn" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="registry-server"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.498507 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.498567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x7sz\" (UniqueName: \"kubernetes.io/projected/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-kube-api-access-8x7sz\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.498590 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g"
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.527645 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh" event={"ID":"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd","Type":"ContainerStarted","Data":"6d8bfb291d86aed9721a7701126d432d5dd1555a7bc160fc6cddff2dc085284f"}
Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.527687 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh" event={"ID":"20655fbe-0d1c-451d-8ab2-1b8e3423fbcd","Type":"ContainerStarted","Data":"42e78426b9ee75e568c2050d6a332e4005abb4b37e408c567550901cb461109e"}
Jan 31 14:47:06 crc
kubenswrapper[4751]: I0131 14:47:06.528395 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.545688 4751 generic.go:334] "Generic (PLEG): container finished" podID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerID="eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8" exitCode=0 Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.545784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4m6r" event={"ID":"8d5f1383-42d7-47a1-9e47-8dba038241d2","Type":"ContainerDied","Data":"eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8"} Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.570663 4751 generic.go:334] "Generic (PLEG): container finished" podID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerID="3bb7101aeb47dd5d5b9aa6ef1075e32a424c360c1ebaa7fd0787c20e4303f647" exitCode=0 Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.570757 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xfl" event={"ID":"e656c7af-fbd9-4e9c-ae61-d4142d37c89f","Type":"ContainerDied","Data":"3bb7101aeb47dd5d5b9aa6ef1075e32a424c360c1ebaa7fd0787c20e4303f647"} Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.600803 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x7sz\" (UniqueName: \"kubernetes.io/projected/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-kube-api-access-8x7sz\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.600859 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.600965 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.602917 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.622143 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc" event={"ID":"d8efbc4e-7e88-4914-8141-8b93ace1dcb0","Type":"ContainerStarted","Data":"1b42a0bcb0990937c0f4a17f1eafa43df652100c854be4cc03e56963d9f512df"} Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.622187 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc" event={"ID":"d8efbc4e-7e88-4914-8141-8b93ace1dcb0","Type":"ContainerStarted","Data":"c34e6be9317685ffc3d68cf782fa0094bfb98f5274e80b311fa44888e6c0023b"} Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.623088 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.627209 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.639733 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x7sz\" (UniqueName: \"kubernetes.io/projected/9853dd16-26f9-4fe4-9468-52d39dd4dd1f-kube-api-access-8x7sz\") pod \"marketplace-operator-79b997595-jv94g\" (UID: \"9853dd16-26f9-4fe4-9468-52d39dd4dd1f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.649732 4751 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8 is running failed: container process not found" containerID="eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.650463 4751 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8 is running failed: container process not found" containerID="eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.651475 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc" podStartSLOduration=3.65146463 podStartE2EDuration="3.65146463s" podCreationTimestamp="2026-01-31 14:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:47:06.650882653 +0000 UTC m=+331.025595538" watchObservedRunningTime="2026-01-31 14:47:06.65146463 +0000 UTC m=+331.026177515" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.653520 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh" podStartSLOduration=2.653513287 podStartE2EDuration="2.653513287s" podCreationTimestamp="2026-01-31 14:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:47:06.581376744 +0000 UTC m=+330.956089629" watchObservedRunningTime="2026-01-31 14:47:06.653513287 +0000 UTC m=+331.028226172" Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.655603 4751 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8 is running failed: container process not found" containerID="eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 14:47:06 crc kubenswrapper[4751]: E0131 14:47:06.655664 4751 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-m4m6r" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="registry-server" Jan 31 14:47:06 crc 
kubenswrapper[4751]: I0131 14:47:06.672380 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69f964bddc-w29sc" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.681857 4751 generic.go:334] "Generic (PLEG): container finished" podID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerID="cc163d448fa8fad6b5ab0077c0960c4003a53c503f6d097090f206fed6245a22" exitCode=0 Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.681913 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" event={"ID":"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea","Type":"ContainerDied","Data":"cc163d448fa8fad6b5ab0077c0960c4003a53c503f6d097090f206fed6245a22"} Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.692893 4751 generic.go:334] "Generic (PLEG): container finished" podID="074619b7-9220-4377-b93d-6088199a5e16" containerID="362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b" exitCode=0 Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.693043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcnsn" event={"ID":"074619b7-9220-4377-b93d-6088199a5e16","Type":"ContainerDied","Data":"362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b"} Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.704940 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.817294 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-786458fd97-vccrh" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.946320 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" Jan 31 14:47:06 crc kubenswrapper[4751]: I0131 14:47:06.997510 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xfl" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.010458 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdrlx\" (UniqueName: \"kubernetes.io/projected/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-kube-api-access-qdrlx\") pod \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.010503 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-trusted-ca\") pod \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.010543 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-operator-metrics\") pod \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\" (UID: \"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.012345 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" (UID: "8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.017492 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-kube-api-access-qdrlx" (OuterVolumeSpecName: "kube-api-access-qdrlx") pod "8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" (UID: "8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea"). InnerVolumeSpecName "kube-api-access-qdrlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.034853 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" (UID: "8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.091350 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m4m6r" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.111514 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-utilities\") pod \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.111636 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-catalog-content\") pod \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.111670 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkxqv\" (UniqueName: \"kubernetes.io/projected/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-kube-api-access-wkxqv\") pod \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\" (UID: \"e656c7af-fbd9-4e9c-ae61-d4142d37c89f\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.111883 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.111900 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdrlx\" (UniqueName: \"kubernetes.io/projected/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-kube-api-access-qdrlx\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.111909 4751 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea-marketplace-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.113988 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-utilities" (OuterVolumeSpecName: "utilities") pod "e656c7af-fbd9-4e9c-ae61-d4142d37c89f" (UID: "e656c7af-fbd9-4e9c-ae61-d4142d37c89f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.115057 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-kube-api-access-wkxqv" (OuterVolumeSpecName: "kube-api-access-wkxqv") pod "e656c7af-fbd9-4e9c-ae61-d4142d37c89f" (UID: "e656c7af-fbd9-4e9c-ae61-d4142d37c89f"). InnerVolumeSpecName "kube-api-access-wkxqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.138820 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e656c7af-fbd9-4e9c-ae61-d4142d37c89f" (UID: "e656c7af-fbd9-4e9c-ae61-d4142d37c89f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.175095 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gktqp" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.180298 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wcnsn" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.213971 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgf8m\" (UniqueName: \"kubernetes.io/projected/0cfb2e52-7371-4d38-994c-92b5b7d123cc-kube-api-access-qgf8m\") pod \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.214085 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-catalog-content\") pod \"8d5f1383-42d7-47a1-9e47-8dba038241d2\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.215616 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-utilities\") pod \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.220765 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-catalog-content\") pod \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\" (UID: \"0cfb2e52-7371-4d38-994c-92b5b7d123cc\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.220848 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566b8\" (UniqueName: \"kubernetes.io/projected/8d5f1383-42d7-47a1-9e47-8dba038241d2-kube-api-access-566b8\") pod \"8d5f1383-42d7-47a1-9e47-8dba038241d2\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.220880 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-pzp7l\" (UniqueName: \"kubernetes.io/projected/074619b7-9220-4377-b93d-6088199a5e16-kube-api-access-pzp7l\") pod \"074619b7-9220-4377-b93d-6088199a5e16\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.221378 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-utilities\") pod \"8d5f1383-42d7-47a1-9e47-8dba038241d2\" (UID: \"8d5f1383-42d7-47a1-9e47-8dba038241d2\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.221490 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-catalog-content\") pod \"074619b7-9220-4377-b93d-6088199a5e16\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.221552 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-utilities\") pod \"074619b7-9220-4377-b93d-6088199a5e16\" (UID: \"074619b7-9220-4377-b93d-6088199a5e16\") " Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.222051 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.222096 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkxqv\" (UniqueName: \"kubernetes.io/projected/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-kube-api-access-wkxqv\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.222110 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e656c7af-fbd9-4e9c-ae61-d4142d37c89f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.222206 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-utilities" (OuterVolumeSpecName: "utilities") pod "8d5f1383-42d7-47a1-9e47-8dba038241d2" (UID: "8d5f1383-42d7-47a1-9e47-8dba038241d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.222420 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-utilities" (OuterVolumeSpecName: "utilities") pod "074619b7-9220-4377-b93d-6088199a5e16" (UID: "074619b7-9220-4377-b93d-6088199a5e16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.222497 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-utilities" (OuterVolumeSpecName: "utilities") pod "0cfb2e52-7371-4d38-994c-92b5b7d123cc" (UID: "0cfb2e52-7371-4d38-994c-92b5b7d123cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.231170 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfb2e52-7371-4d38-994c-92b5b7d123cc-kube-api-access-qgf8m" (OuterVolumeSpecName: "kube-api-access-qgf8m") pod "0cfb2e52-7371-4d38-994c-92b5b7d123cc" (UID: "0cfb2e52-7371-4d38-994c-92b5b7d123cc"). InnerVolumeSpecName "kube-api-access-qgf8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.231837 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5f1383-42d7-47a1-9e47-8dba038241d2-kube-api-access-566b8" (OuterVolumeSpecName: "kube-api-access-566b8") pod "8d5f1383-42d7-47a1-9e47-8dba038241d2" (UID: "8d5f1383-42d7-47a1-9e47-8dba038241d2"). InnerVolumeSpecName "kube-api-access-566b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.232561 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074619b7-9220-4377-b93d-6088199a5e16-kube-api-access-pzp7l" (OuterVolumeSpecName: "kube-api-access-pzp7l") pod "074619b7-9220-4377-b93d-6088199a5e16" (UID: "074619b7-9220-4377-b93d-6088199a5e16"). InnerVolumeSpecName "kube-api-access-pzp7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.273423 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d5f1383-42d7-47a1-9e47-8dba038241d2" (UID: "8d5f1383-42d7-47a1-9e47-8dba038241d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.293887 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "074619b7-9220-4377-b93d-6088199a5e16" (UID: "074619b7-9220-4377-b93d-6088199a5e16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323443 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgf8m\" (UniqueName: \"kubernetes.io/projected/0cfb2e52-7371-4d38-994c-92b5b7d123cc-kube-api-access-qgf8m\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323466 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323476 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323486 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-566b8\" (UniqueName: \"kubernetes.io/projected/8d5f1383-42d7-47a1-9e47-8dba038241d2-kube-api-access-566b8\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323494 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzp7l\" (UniqueName: \"kubernetes.io/projected/074619b7-9220-4377-b93d-6088199a5e16-kube-api-access-pzp7l\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323503 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d5f1383-42d7-47a1-9e47-8dba038241d2-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.323510 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:07 crc kubenswrapper[4751]: 
I0131 14:47:07.323518 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074619b7-9220-4377-b93d-6088199a5e16-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.354566 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jv94g"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.374887 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cfb2e52-7371-4d38-994c-92b5b7d123cc" (UID: "0cfb2e52-7371-4d38-994c-92b5b7d123cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.424533 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfb2e52-7371-4d38-994c-92b5b7d123cc-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.699296 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.699369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5r6kv" event={"ID":"8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea","Type":"ContainerDied","Data":"8d8b4a1528af48d18db181db8a7bebc79bb86f32aba8601a554e74b7bcaef05b"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.699435 4751 scope.go:117] "RemoveContainer" containerID="cc163d448fa8fad6b5ab0077c0960c4003a53c503f6d097090f206fed6245a22"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.705192 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcnsn" event={"ID":"074619b7-9220-4377-b93d-6088199a5e16","Type":"ContainerDied","Data":"092d3acc3e94a3dfd58bc12b9df82ef7950bf9b5a3e7871999c9c0efa3eb1c6d"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.705247 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcnsn"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.714304 4751 scope.go:117] "RemoveContainer" containerID="362555e9a4bda60e895d4cff8fad32fbdb6800b24c4a8d8deeb2ac026aebcc1b"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.716790 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xfl" event={"ID":"e656c7af-fbd9-4e9c-ae61-d4142d37c89f","Type":"ContainerDied","Data":"ed378354261ea17a2d24e834a9aed8f1a45166375fb6ae1ce1dc38b9af3b5e0f"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.716812 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xfl"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.719949 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4m6r" event={"ID":"8d5f1383-42d7-47a1-9e47-8dba038241d2","Type":"ContainerDied","Data":"cce74deb968262c3870a67f8d4e000b52815c6a74a72fbfe9270cef7ee6b23e7"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.719993 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4m6r"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.723121 4751 generic.go:334] "Generic (PLEG): container finished" podID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerID="632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea" exitCode=0
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.723193 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerDied","Data":"632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.723239 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gktqp" event={"ID":"0cfb2e52-7371-4d38-994c-92b5b7d123cc","Type":"ContainerDied","Data":"6d8aa8d0e0300436346b38972033f042890b145471a99f8a553c2f56d280787e"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.723359 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gktqp"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.728781 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" event={"ID":"9853dd16-26f9-4fe4-9468-52d39dd4dd1f","Type":"ContainerStarted","Data":"9fb26ef265cb69ab4af712c357f3693b010e56fd9a06a09ee5a8f9d24d9f4442"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.728816 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" event={"ID":"9853dd16-26f9-4fe4-9468-52d39dd4dd1f","Type":"ContainerStarted","Data":"c98f7e5064bdc2ff52e44a7c61caa0f113237629962f0d6aede141632e8b125e"}
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.736646 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5r6kv"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.742895 4751 scope.go:117] "RemoveContainer" containerID="a757fc9386532749c4b360530fb36362a62f17d343908433db3d64555171c0b9"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.749309 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5r6kv"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.765110 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g" podStartSLOduration=1.765093772 podStartE2EDuration="1.765093772s" podCreationTimestamp="2026-01-31 14:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:47:07.761282535 +0000 UTC m=+332.135995430" watchObservedRunningTime="2026-01-31 14:47:07.765093772 +0000 UTC m=+332.139806657"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.783338 4751 scope.go:117] "RemoveContainer" containerID="c0a252955873aa8b7cfdf7c617f1852f7e64f86f50411d0f5cc675309d6a71b6"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.791369 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xfl"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.795745 4751 scope.go:117] "RemoveContainer" containerID="3bb7101aeb47dd5d5b9aa6ef1075e32a424c360c1ebaa7fd0787c20e4303f647"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.799042 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xfl"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.804404 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcnsn"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.808677 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wcnsn"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.812368 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m4m6r"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.817082 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m4m6r"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.822163 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gktqp"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.823508 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gktqp"]
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.824859 4751 scope.go:117] "RemoveContainer" containerID="874aebfb442c94d60aaad947db92520e6e5ff745ee226afefd00dd9dc85cb564"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.837987 4751 scope.go:117] "RemoveContainer" containerID="6c75c5ad4aa0723fec261497091fc30b60d95e73f9fe993ece85f3e477da66ef"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.850403 4751 scope.go:117] "RemoveContainer" containerID="eabca8f8fcdbfb2f04b488498b2a615e9946a5ba739f9fb75c570ef168f4bcd8"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.863494 4751 scope.go:117] "RemoveContainer" containerID="c0f53c12a6e17e599de6a624dae5a0ba532d7e88bc9baf9838475b082d03f347"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.877562 4751 scope.go:117] "RemoveContainer" containerID="e34fa377384a9a30f2361b80400e882c53155e0b5c8ad5f9beb3a5c178384ca0"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.887506 4751 scope.go:117] "RemoveContainer" containerID="632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.908553 4751 scope.go:117] "RemoveContainer" containerID="0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.925882 4751 scope.go:117] "RemoveContainer" containerID="aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.937023 4751 scope.go:117] "RemoveContainer" containerID="632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea"
Jan 31 14:47:07 crc kubenswrapper[4751]: E0131 14:47:07.937429 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea\": container with ID starting with 632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea not found: ID does not exist" containerID="632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.937457 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea"} err="failed to get container status \"632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea\": rpc error: code = NotFound desc = could not find container \"632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea\": container with ID starting with 632dd7cf21c157e19b8b506aada8a2a0cc9ee7a4c7089d92374fc5dc9f67b1ea not found: ID does not exist"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.937478 4751 scope.go:117] "RemoveContainer" containerID="0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765"
Jan 31 14:47:07 crc kubenswrapper[4751]: E0131 14:47:07.937675 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765\": container with ID starting with 0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765 not found: ID does not exist" containerID="0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.937697 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765"} err="failed to get container status \"0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765\": rpc error: code = NotFound desc = could not find container \"0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765\": container with ID starting with 0fd8fddc66836c1f7f3a6139ad51fa7a751ab965677515d2805ae4e6c08a2765 not found: ID does not exist"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.937710 4751 scope.go:117] "RemoveContainer" containerID="aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23"
Jan 31 14:47:07 crc kubenswrapper[4751]: E0131 14:47:07.937894 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23\": container with ID starting with aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23 not found: ID does not exist" containerID="aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23"
Jan 31 14:47:07 crc kubenswrapper[4751]: I0131 14:47:07.937909 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23"} err="failed to get container status \"aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23\": rpc error: code = NotFound desc = could not find container \"aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23\": container with ID starting with aa50b668454ba4cf1d6033028034c77daf53f009e58a1184a7d22b857abf8b23 not found: ID does not exist"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.413763 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="074619b7-9220-4377-b93d-6088199a5e16" path="/var/lib/kubelet/pods/074619b7-9220-4377-b93d-6088199a5e16/volumes"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.414819 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" path="/var/lib/kubelet/pods/0cfb2e52-7371-4d38-994c-92b5b7d123cc/volumes"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.415655 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" path="/var/lib/kubelet/pods/8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea/volumes"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.416811 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" path="/var/lib/kubelet/pods/8d5f1383-42d7-47a1-9e47-8dba038241d2/volumes"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.417594 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" path="/var/lib/kubelet/pods/e656c7af-fbd9-4e9c-ae61-d4142d37c89f/volumes"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.742649 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.744804 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jv94g"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825389 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-22krg"]
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825555 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825566 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825574 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825580 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825590 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825596 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825604 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825609 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825615 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825621 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825628 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825634 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825642 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825647 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825658 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825664 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="extract-content"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825673 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825678 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825688 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825693 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825704 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825710 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825718 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825723 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="extract-utilities"
Jan 31 14:47:08 crc kubenswrapper[4751]: E0131 14:47:08.825731 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825736 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825809 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e656c7af-fbd9-4e9c-ae61-d4142d37c89f" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825820 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="074619b7-9220-4377-b93d-6088199a5e16" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825827 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfb2e52-7371-4d38-994c-92b5b7d123cc" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825837 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc2bde1-a50e-47bb-8211-4b5ed0ac74ea" containerName="marketplace-operator"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.825845 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5f1383-42d7-47a1-9e47-8dba038241d2" containerName="registry-server"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.826462 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.830319 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.841392 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22krg"]
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.945089 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dppkm\" (UniqueName: \"kubernetes.io/projected/affc293d-ac4e-49ad-be4a-bc13d7c056a7-kube-api-access-dppkm\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.945139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affc293d-ac4e-49ad-be4a-bc13d7c056a7-utilities\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:08 crc kubenswrapper[4751]: I0131 14:47:08.945260 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affc293d-ac4e-49ad-be4a-bc13d7c056a7-catalog-content\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.046243 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dppkm\" (UniqueName: \"kubernetes.io/projected/affc293d-ac4e-49ad-be4a-bc13d7c056a7-kube-api-access-dppkm\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.046301 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affc293d-ac4e-49ad-be4a-bc13d7c056a7-utilities\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.046361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affc293d-ac4e-49ad-be4a-bc13d7c056a7-catalog-content\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.046839 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/affc293d-ac4e-49ad-be4a-bc13d7c056a7-catalog-content\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.047030 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/affc293d-ac4e-49ad-be4a-bc13d7c056a7-utilities\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.079016 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dppkm\" (UniqueName: \"kubernetes.io/projected/affc293d-ac4e-49ad-be4a-bc13d7c056a7-kube-api-access-dppkm\") pod \"redhat-marketplace-22krg\" (UID: \"affc293d-ac4e-49ad-be4a-bc13d7c056a7\") " pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.142099 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22krg"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.594459 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22krg"]
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.747800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22krg" event={"ID":"affc293d-ac4e-49ad-be4a-bc13d7c056a7","Type":"ContainerStarted","Data":"e2924e6df1ae2df930cc380ff4f034e1e01fc092d1b0cac883bd6d09d6d43e8e"}
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.828708 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-678m7"]
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.829942 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.833662 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.839431 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-678m7"]
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.959017 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fbbbf2-c128-46a4-9cc3-99e46c617027-utilities\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.959185 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fbbbf2-c128-46a4-9cc3-99e46c617027-catalog-content\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:09 crc kubenswrapper[4751]: I0131 14:47:09.959244 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcznw\" (UniqueName: \"kubernetes.io/projected/43fbbbf2-c128-46a4-9cc3-99e46c617027-kube-api-access-wcznw\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.060871 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fbbbf2-c128-46a4-9cc3-99e46c617027-utilities\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.060940 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fbbbf2-c128-46a4-9cc3-99e46c617027-catalog-content\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.060994 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcznw\" (UniqueName: \"kubernetes.io/projected/43fbbbf2-c128-46a4-9cc3-99e46c617027-kube-api-access-wcznw\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.061778 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fbbbf2-c128-46a4-9cc3-99e46c617027-utilities\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.061984 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fbbbf2-c128-46a4-9cc3-99e46c617027-catalog-content\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.079293 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcznw\" (UniqueName: \"kubernetes.io/projected/43fbbbf2-c128-46a4-9cc3-99e46c617027-kube-api-access-wcznw\") pod \"redhat-operators-678m7\" (UID: \"43fbbbf2-c128-46a4-9cc3-99e46c617027\") " pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.161267 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-678m7"
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.559996 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-678m7"]
Jan 31 14:47:10 crc kubenswrapper[4751]: W0131 14:47:10.567120 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fbbbf2_c128_46a4_9cc3_99e46c617027.slice/crio-d06bd7bb965534d3f4081f66291469dfeeba3fc44f0ad41ce4151ef674a66b32 WatchSource:0}: Error finding container d06bd7bb965534d3f4081f66291469dfeeba3fc44f0ad41ce4151ef674a66b32: Status 404 returned error can't find the container with id d06bd7bb965534d3f4081f66291469dfeeba3fc44f0ad41ce4151ef674a66b32
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.754794 4751 generic.go:334] "Generic (PLEG): container finished" podID="affc293d-ac4e-49ad-be4a-bc13d7c056a7" containerID="ef0b6ffeed9764097de7924c9d5800599c1fcf813b5b2868a854c3e83b3eddeb" exitCode=0
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.754870 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22krg" event={"ID":"affc293d-ac4e-49ad-be4a-bc13d7c056a7","Type":"ContainerDied","Data":"ef0b6ffeed9764097de7924c9d5800599c1fcf813b5b2868a854c3e83b3eddeb"}
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.759756 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-678m7" event={"ID":"43fbbbf2-c128-46a4-9cc3-99e46c617027","Type":"ContainerStarted","Data":"f2cafee39d3e2d35eefedf868fc04160baf4ea42c8c5028c544445c72eedb2be"}
Jan 31 14:47:10 crc kubenswrapper[4751]: I0131 14:47:10.759800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-678m7" event={"ID":"43fbbbf2-c128-46a4-9cc3-99e46c617027","Type":"ContainerStarted","Data":"d06bd7bb965534d3f4081f66291469dfeeba3fc44f0ad41ce4151ef674a66b32"}
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.229381 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qcs7h"]
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.231256 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.234397 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.235965 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qcs7h"]
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.275938 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83f0a10-f56b-4795-93b9-ee224d439648-utilities\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.276007 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83f0a10-f56b-4795-93b9-ee224d439648-catalog-content\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.276034 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72wf\" (UniqueName: \"kubernetes.io/projected/c83f0a10-f56b-4795-93b9-ee224d439648-kube-api-access-w72wf\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.378036 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83f0a10-f56b-4795-93b9-ee224d439648-utilities\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.378538 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83f0a10-f56b-4795-93b9-ee224d439648-catalog-content\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.378697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72wf\" (UniqueName: \"kubernetes.io/projected/c83f0a10-f56b-4795-93b9-ee224d439648-kube-api-access-w72wf\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.378794 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83f0a10-f56b-4795-93b9-ee224d439648-utilities\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.379138 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83f0a10-f56b-4795-93b9-ee224d439648-catalog-content\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.866328 4751 generic.go:334] "Generic (PLEG): container finished" podID="43fbbbf2-c128-46a4-9cc3-99e46c617027" containerID="f2cafee39d3e2d35eefedf868fc04160baf4ea42c8c5028c544445c72eedb2be" exitCode=0
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.866546 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-678m7" event={"ID":"43fbbbf2-c128-46a4-9cc3-99e46c617027","Type":"ContainerDied","Data":"f2cafee39d3e2d35eefedf868fc04160baf4ea42c8c5028c544445c72eedb2be"}
Jan 31 14:47:11 crc kubenswrapper[4751]: I0131 14:47:11.871039 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72wf\" (UniqueName: \"kubernetes.io/projected/c83f0a10-f56b-4795-93b9-ee224d439648-kube-api-access-w72wf\") pod \"certified-operators-qcs7h\" (UID: \"c83f0a10-f56b-4795-93b9-ee224d439648\") " pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:12 crc kubenswrapper[4751]: I0131 14:47:12.159923 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qcs7h"
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.428167 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gr5gf"]
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.429466 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gr5gf"
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.431778 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gr5gf"]
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.432419 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.495200 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdm52\" (UniqueName: \"kubernetes.io/projected/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-kube-api-access-fdm52\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf"
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.495254 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-utilities\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf"
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.495294 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-catalog-content\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf"
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.593588 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qcs7h"]
Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.596680 4751 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fdm52\" (UniqueName: \"kubernetes.io/projected/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-kube-api-access-fdm52\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.596715 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-utilities\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.596747 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-catalog-content\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.597168 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-catalog-content\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.597607 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-utilities\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.621093 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdm52\" (UniqueName: 
\"kubernetes.io/projected/2eb5e3aa-17fa-49a0-a422-bc69a8a410fb-kube-api-access-fdm52\") pod \"community-operators-gr5gf\" (UID: \"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb\") " pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.745944 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.879408 4751 generic.go:334] "Generic (PLEG): container finished" podID="affc293d-ac4e-49ad-be4a-bc13d7c056a7" containerID="2a4c0e6fdb547d7da161d6df9f6a153b6a313115facfb8588f53546344a84b83" exitCode=0 Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.879482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22krg" event={"ID":"affc293d-ac4e-49ad-be4a-bc13d7c056a7","Type":"ContainerDied","Data":"2a4c0e6fdb547d7da161d6df9f6a153b6a313115facfb8588f53546344a84b83"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.882284 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-678m7" event={"ID":"43fbbbf2-c128-46a4-9cc3-99e46c617027","Type":"ContainerStarted","Data":"cbc01a65e88cb04d479e3fbee6b56cd23c306d0d404314544d61604afee6ce91"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.885504 4751 generic.go:334] "Generic (PLEG): container finished" podID="c83f0a10-f56b-4795-93b9-ee224d439648" containerID="f0c7d210f4905c5ebd63ad1688d4962a22611a0650fc973151439c47e20f365f" exitCode=0 Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.886240 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcs7h" event={"ID":"c83f0a10-f56b-4795-93b9-ee224d439648","Type":"ContainerDied","Data":"f0c7d210f4905c5ebd63ad1688d4962a22611a0650fc973151439c47e20f365f"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:12.886368 4751 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-qcs7h" event={"ID":"c83f0a10-f56b-4795-93b9-ee224d439648","Type":"ContainerStarted","Data":"43ba868ae0b26e8ee6fc70a86bc9fb9f499781411409ffef9af2e7dd09c6176a"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.446807 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gr5gf"] Jan 31 14:47:13 crc kubenswrapper[4751]: W0131 14:47:13.454940 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb5e3aa_17fa_49a0_a422_bc69a8a410fb.slice/crio-a9230f306dbb7839004de374b90ecaee07288759ea0cfb21958fdb7f3d0a7a06 WatchSource:0}: Error finding container a9230f306dbb7839004de374b90ecaee07288759ea0cfb21958fdb7f3d0a7a06: Status 404 returned error can't find the container with id a9230f306dbb7839004de374b90ecaee07288759ea0cfb21958fdb7f3d0a7a06 Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.892505 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcs7h" event={"ID":"c83f0a10-f56b-4795-93b9-ee224d439648","Type":"ContainerStarted","Data":"0e1f4b1ea2d1f4c691c80005d0d4b88eefb40ca9917b8e2e9866ba3f5c04a5c5"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.893738 4751 generic.go:334] "Generic (PLEG): container finished" podID="2eb5e3aa-17fa-49a0-a422-bc69a8a410fb" containerID="c4234f74dd13343823cab275e87af0dec8660d77f7c5674d07ed63ca0ba425fa" exitCode=0 Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.893795 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gr5gf" event={"ID":"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb","Type":"ContainerDied","Data":"c4234f74dd13343823cab275e87af0dec8660d77f7c5674d07ed63ca0ba425fa"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.893830 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-gr5gf" event={"ID":"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb","Type":"ContainerStarted","Data":"a9230f306dbb7839004de374b90ecaee07288759ea0cfb21958fdb7f3d0a7a06"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.896387 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22krg" event={"ID":"affc293d-ac4e-49ad-be4a-bc13d7c056a7","Type":"ContainerStarted","Data":"c665d5714ce1ec2e2819ec5b82889b2b386aa67d7901b6a62dbf023eefbc4de3"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.897998 4751 generic.go:334] "Generic (PLEG): container finished" podID="43fbbbf2-c128-46a4-9cc3-99e46c617027" containerID="cbc01a65e88cb04d479e3fbee6b56cd23c306d0d404314544d61604afee6ce91" exitCode=0 Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.898022 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-678m7" event={"ID":"43fbbbf2-c128-46a4-9cc3-99e46c617027","Type":"ContainerDied","Data":"cbc01a65e88cb04d479e3fbee6b56cd23c306d0d404314544d61604afee6ce91"} Jan 31 14:47:13 crc kubenswrapper[4751]: I0131 14:47:13.938834 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-22krg" podStartSLOduration=3.372518422 podStartE2EDuration="5.938814955s" podCreationTimestamp="2026-01-31 14:47:08 +0000 UTC" firstStartedPulling="2026-01-31 14:47:10.757200016 +0000 UTC m=+335.131912911" lastFinishedPulling="2026-01-31 14:47:13.323496559 +0000 UTC m=+337.698209444" observedRunningTime="2026-01-31 14:47:13.935813921 +0000 UTC m=+338.310526806" watchObservedRunningTime="2026-01-31 14:47:13.938814955 +0000 UTC m=+338.313527840" Jan 31 14:47:14 crc kubenswrapper[4751]: I0131 14:47:14.908451 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-678m7" 
event={"ID":"43fbbbf2-c128-46a4-9cc3-99e46c617027","Type":"ContainerStarted","Data":"cbc93651e15fe9b1b53de7dc02f8cb49804246c847642a9031bf67df3f58a6d8"} Jan 31 14:47:14 crc kubenswrapper[4751]: I0131 14:47:14.914172 4751 generic.go:334] "Generic (PLEG): container finished" podID="c83f0a10-f56b-4795-93b9-ee224d439648" containerID="0e1f4b1ea2d1f4c691c80005d0d4b88eefb40ca9917b8e2e9866ba3f5c04a5c5" exitCode=0 Jan 31 14:47:14 crc kubenswrapper[4751]: I0131 14:47:14.915350 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcs7h" event={"ID":"c83f0a10-f56b-4795-93b9-ee224d439648","Type":"ContainerDied","Data":"0e1f4b1ea2d1f4c691c80005d0d4b88eefb40ca9917b8e2e9866ba3f5c04a5c5"} Jan 31 14:47:14 crc kubenswrapper[4751]: I0131 14:47:14.928038 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-678m7" podStartSLOduration=3.517110504 podStartE2EDuration="5.928021128s" podCreationTimestamp="2026-01-31 14:47:09 +0000 UTC" firstStartedPulling="2026-01-31 14:47:11.868354389 +0000 UTC m=+336.243067274" lastFinishedPulling="2026-01-31 14:47:14.279265013 +0000 UTC m=+338.653977898" observedRunningTime="2026-01-31 14:47:14.927605776 +0000 UTC m=+339.302318691" watchObservedRunningTime="2026-01-31 14:47:14.928021128 +0000 UTC m=+339.302734003" Jan 31 14:47:15 crc kubenswrapper[4751]: I0131 14:47:15.926840 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcs7h" event={"ID":"c83f0a10-f56b-4795-93b9-ee224d439648","Type":"ContainerStarted","Data":"205e2afe79bb15a6fb42be9a5245809e70944c85ed4ba914f8281a5585cee3a0"} Jan 31 14:47:15 crc kubenswrapper[4751]: I0131 14:47:15.929500 4751 generic.go:334] "Generic (PLEG): container finished" podID="2eb5e3aa-17fa-49a0-a422-bc69a8a410fb" containerID="f4c859cc4863edeb57864d4b70863a93d3378c06e1797e924a9ef9213bda12a3" exitCode=0 Jan 31 14:47:15 crc kubenswrapper[4751]: I0131 14:47:15.930515 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gr5gf" event={"ID":"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb","Type":"ContainerDied","Data":"f4c859cc4863edeb57864d4b70863a93d3378c06e1797e924a9ef9213bda12a3"} Jan 31 14:47:15 crc kubenswrapper[4751]: I0131 14:47:15.946168 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qcs7h" podStartSLOduration=2.428422992 podStartE2EDuration="4.946155062s" podCreationTimestamp="2026-01-31 14:47:11 +0000 UTC" firstStartedPulling="2026-01-31 14:47:12.888286154 +0000 UTC m=+337.262999039" lastFinishedPulling="2026-01-31 14:47:15.406018224 +0000 UTC m=+339.780731109" observedRunningTime="2026-01-31 14:47:15.941036359 +0000 UTC m=+340.315749234" watchObservedRunningTime="2026-01-31 14:47:15.946155062 +0000 UTC m=+340.320867947" Jan 31 14:47:17 crc kubenswrapper[4751]: I0131 14:47:17.941099 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gr5gf" event={"ID":"2eb5e3aa-17fa-49a0-a422-bc69a8a410fb","Type":"ContainerStarted","Data":"a295af802189ad1afcb88d928212ac435dbd647d08b6f500b14457174599fe98"} Jan 31 14:47:17 crc kubenswrapper[4751]: I0131 14:47:17.960893 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gr5gf" podStartSLOduration=3.533130167 podStartE2EDuration="5.960877685s" podCreationTimestamp="2026-01-31 14:47:12 +0000 UTC" firstStartedPulling="2026-01-31 14:47:13.895308125 +0000 UTC m=+338.270021010" lastFinishedPulling="2026-01-31 14:47:16.323055643 +0000 UTC m=+340.697768528" observedRunningTime="2026-01-31 14:47:17.959008603 +0000 UTC m=+342.333721488" watchObservedRunningTime="2026-01-31 14:47:17.960877685 +0000 UTC m=+342.335590580" Jan 31 14:47:19 crc kubenswrapper[4751]: I0131 14:47:19.143577 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-22krg" Jan 31 14:47:19 crc kubenswrapper[4751]: I0131 14:47:19.143839 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-22krg" Jan 31 14:47:19 crc kubenswrapper[4751]: I0131 14:47:19.215801 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-22krg" Jan 31 14:47:19 crc kubenswrapper[4751]: I0131 14:47:19.997693 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-22krg" Jan 31 14:47:20 crc kubenswrapper[4751]: I0131 14:47:20.161757 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-678m7" Jan 31 14:47:20 crc kubenswrapper[4751]: I0131 14:47:20.162047 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-678m7" Jan 31 14:47:21 crc kubenswrapper[4751]: I0131 14:47:21.198781 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-678m7" podUID="43fbbbf2-c128-46a4-9cc3-99e46c617027" containerName="registry-server" probeResult="failure" output=< Jan 31 14:47:21 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Jan 31 14:47:21 crc kubenswrapper[4751]: > Jan 31 14:47:22 crc kubenswrapper[4751]: I0131 14:47:22.159999 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qcs7h" Jan 31 14:47:22 crc kubenswrapper[4751]: I0131 14:47:22.160336 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qcs7h" Jan 31 14:47:22 crc kubenswrapper[4751]: I0131 14:47:22.219765 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qcs7h" Jan 31 14:47:22 
crc kubenswrapper[4751]: I0131 14:47:22.746401 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:22 crc kubenswrapper[4751]: I0131 14:47:22.746474 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:22 crc kubenswrapper[4751]: I0131 14:47:22.788942 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:23 crc kubenswrapper[4751]: I0131 14:47:23.016637 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qcs7h" Jan 31 14:47:23 crc kubenswrapper[4751]: I0131 14:47:23.022414 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gr5gf" Jan 31 14:47:24 crc kubenswrapper[4751]: I0131 14:47:24.148577 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5fdjn" Jan 31 14:47:24 crc kubenswrapper[4751]: I0131 14:47:24.211476 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpbgx"] Jan 31 14:47:30 crc kubenswrapper[4751]: I0131 14:47:30.214001 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-678m7" Jan 31 14:47:30 crc kubenswrapper[4751]: I0131 14:47:30.258301 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-678m7" Jan 31 14:47:38 crc kubenswrapper[4751]: I0131 14:47:38.897021 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:47:38 crc kubenswrapper[4751]: I0131 14:47:38.897800 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:47:49 crc kubenswrapper[4751]: I0131 14:47:49.264023 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" podUID="4e18e163-6cf0-48ef-9a6f-90cbece870b0" containerName="registry" containerID="cri-o://4a4776950d27c1d1245ca6dd71fb7012b30d42bb2d21525539ad27b3f377c032" gracePeriod=30 Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.150159 4751 generic.go:334] "Generic (PLEG): container finished" podID="4e18e163-6cf0-48ef-9a6f-90cbece870b0" containerID="4a4776950d27c1d1245ca6dd71fb7012b30d42bb2d21525539ad27b3f377c032" exitCode=0 Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.150211 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" event={"ID":"4e18e163-6cf0-48ef-9a6f-90cbece870b0","Type":"ContainerDied","Data":"4a4776950d27c1d1245ca6dd71fb7012b30d42bb2d21525539ad27b3f377c032"} Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.269461 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339297 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e18e163-6cf0-48ef-9a6f-90cbece870b0-installation-pull-secrets\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339614 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339663 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-bound-sa-token\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e18e163-6cf0-48ef-9a6f-90cbece870b0-ca-trust-extracted\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339762 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-tls\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339804 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-trusted-ca\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339837 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-certificates\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.339874 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llx87\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-kube-api-access-llx87\") pod \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\" (UID: \"4e18e163-6cf0-48ef-9a6f-90cbece870b0\") " Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.342386 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.342579 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.346405 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.349125 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.349408 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e18e163-6cf0-48ef-9a6f-90cbece870b0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.350043 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-kube-api-access-llx87" (OuterVolumeSpecName: "kube-api-access-llx87") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "kube-api-access-llx87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.354018 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.357812 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e18e163-6cf0-48ef-9a6f-90cbece870b0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4e18e163-6cf0-48ef-9a6f-90cbece870b0" (UID: "4e18e163-6cf0-48ef-9a6f-90cbece870b0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440529 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llx87\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-kube-api-access-llx87\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440563 4751 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e18e163-6cf0-48ef-9a6f-90cbece870b0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440576 4751 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440587 4751 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/4e18e163-6cf0-48ef-9a6f-90cbece870b0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440599 4751 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440611 4751 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: I0131 14:47:50.440621 4751 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e18e163-6cf0-48ef-9a6f-90cbece870b0-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 14:47:50 crc kubenswrapper[4751]: E0131 14:47:50.531187 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e18e163_6cf0_48ef_9a6f_90cbece870b0.slice\": RecentStats: unable to find data in memory cache]" Jan 31 14:47:51 crc kubenswrapper[4751]: I0131 14:47:51.158661 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" event={"ID":"4e18e163-6cf0-48ef-9a6f-90cbece870b0","Type":"ContainerDied","Data":"f189ebd73b2de2ffc6329477d3690421c7e4c89608c81de50df6ebb8b9b1c5e0"} Jan 31 14:47:51 crc kubenswrapper[4751]: I0131 14:47:51.158748 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpbgx" Jan 31 14:47:51 crc kubenswrapper[4751]: I0131 14:47:51.158758 4751 scope.go:117] "RemoveContainer" containerID="4a4776950d27c1d1245ca6dd71fb7012b30d42bb2d21525539ad27b3f377c032" Jan 31 14:47:51 crc kubenswrapper[4751]: I0131 14:47:51.204354 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpbgx"] Jan 31 14:47:51 crc kubenswrapper[4751]: I0131 14:47:51.211768 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpbgx"] Jan 31 14:47:52 crc kubenswrapper[4751]: I0131 14:47:52.417728 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e18e163-6cf0-48ef-9a6f-90cbece870b0" path="/var/lib/kubelet/pods/4e18e163-6cf0-48ef-9a6f-90cbece870b0/volumes" Jan 31 14:48:08 crc kubenswrapper[4751]: I0131 14:48:08.897339 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:48:08 crc kubenswrapper[4751]: I0131 14:48:08.898138 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:48:38 crc kubenswrapper[4751]: I0131 14:48:38.896880 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:48:38 
crc kubenswrapper[4751]: I0131 14:48:38.897547 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:48:38 crc kubenswrapper[4751]: I0131 14:48:38.897613 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:48:38 crc kubenswrapper[4751]: I0131 14:48:38.898984 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45cb0d3a062f00471c149bf8e8ee7eaef0df67968aef3870677e63ed898aa00d"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 14:48:38 crc kubenswrapper[4751]: I0131 14:48:38.899179 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://45cb0d3a062f00471c149bf8e8ee7eaef0df67968aef3870677e63ed898aa00d" gracePeriod=600 Jan 31 14:48:39 crc kubenswrapper[4751]: I0131 14:48:39.511889 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="45cb0d3a062f00471c149bf8e8ee7eaef0df67968aef3870677e63ed898aa00d" exitCode=0 Jan 31 14:48:39 crc kubenswrapper[4751]: I0131 14:48:39.512006 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" 
event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"45cb0d3a062f00471c149bf8e8ee7eaef0df67968aef3870677e63ed898aa00d"} Jan 31 14:48:39 crc kubenswrapper[4751]: I0131 14:48:39.512234 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"ef29f0f695de11b302d97f5ade678c0ae9fdc43953c2430b685d7fd276ee3217"} Jan 31 14:48:39 crc kubenswrapper[4751]: I0131 14:48:39.512272 4751 scope.go:117] "RemoveContainer" containerID="3956d143be77f4a50143f9678eb51ab7871e250cae73d87c9e7fce2575e466c2" Jan 31 14:50:36 crc kubenswrapper[4751]: I0131 14:50:36.754271 4751 scope.go:117] "RemoveContainer" containerID="8e402889398f0b5d93bacd46f42378e3cdc7f2ee478995578d04804d8ec0f029" Jan 31 14:50:36 crc kubenswrapper[4751]: I0131 14:50:36.780421 4751 scope.go:117] "RemoveContainer" containerID="96a0531e47323a9257c24b651a7067cc71a6c2a1c9189022bfa8c72e23c446c1" Jan 31 14:51:08 crc kubenswrapper[4751]: I0131 14:51:08.897367 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:51:08 crc kubenswrapper[4751]: I0131 14:51:08.898462 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:51:38 crc kubenswrapper[4751]: I0131 14:51:38.897321 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:51:38 crc kubenswrapper[4751]: I0131 14:51:38.898185 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:52:08 crc kubenswrapper[4751]: I0131 14:52:08.896842 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:52:08 crc kubenswrapper[4751]: I0131 14:52:08.897433 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:52:08 crc kubenswrapper[4751]: I0131 14:52:08.897490 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:52:08 crc kubenswrapper[4751]: I0131 14:52:08.898303 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef29f0f695de11b302d97f5ade678c0ae9fdc43953c2430b685d7fd276ee3217"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 14:52:08 crc kubenswrapper[4751]: I0131 14:52:08.898394 4751 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://ef29f0f695de11b302d97f5ade678c0ae9fdc43953c2430b685d7fd276ee3217" gracePeriod=600 Jan 31 14:52:09 crc kubenswrapper[4751]: I0131 14:52:09.977361 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="ef29f0f695de11b302d97f5ade678c0ae9fdc43953c2430b685d7fd276ee3217" exitCode=0 Jan 31 14:52:09 crc kubenswrapper[4751]: I0131 14:52:09.977885 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"ef29f0f695de11b302d97f5ade678c0ae9fdc43953c2430b685d7fd276ee3217"} Jan 31 14:52:09 crc kubenswrapper[4751]: I0131 14:52:09.978538 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"f4d4f92719c72ec0adb31e02a10d5c8bcb4b1a9b3bfb5b0e7ed8cfdbc4a53235"} Jan 31 14:52:09 crc kubenswrapper[4751]: I0131 14:52:09.978639 4751 scope.go:117] "RemoveContainer" containerID="45cb0d3a062f00471c149bf8e8ee7eaef0df67968aef3870677e63ed898aa00d" Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.655141 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8cdt"] Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.656536 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-controller" containerID="cri-o://e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148" gracePeriod=30 Jan 31 14:52:14 crc 
kubenswrapper[4751]: I0131 14:52:14.657618 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="sbdb" containerID="cri-o://e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c" gracePeriod=30 Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.657701 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="nbdb" containerID="cri-o://701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232" gracePeriod=30 Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.657761 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="northd" containerID="cri-o://5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010" gracePeriod=30 Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.657816 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de" gracePeriod=30 Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.657882 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-node" containerID="cri-o://f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba" gracePeriod=30 Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.658021 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" 
podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-acl-logging" containerID="cri-o://357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74" gracePeriod=30 Jan 31 14:52:14 crc kubenswrapper[4751]: I0131 14:52:14.708425 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" containerID="cri-o://153c98b7ebe36043f7ae094ec4ae3226c12652e95174c4ff2d00efc441bdb785" gracePeriod=30 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.017807 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/3.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.021794 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovn-acl-logging/0.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.028473 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovnkube-controller/3.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.028637 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovn-controller/0.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029728 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="153c98b7ebe36043f7ae094ec4ae3226c12652e95174c4ff2d00efc441bdb785" exitCode=0 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029788 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c" exitCode=0 Jan 31 
14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029810 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232" exitCode=0 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029812 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"153c98b7ebe36043f7ae094ec4ae3226c12652e95174c4ff2d00efc441bdb785"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029834 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010" exitCode=0 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029852 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de" exitCode=0 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029874 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba" exitCode=0 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029894 4751 scope.go:117] "RemoveContainer" containerID="373c89defd0c3e17f3124be6af9afba6b241a48af85f558bb51d281d16ba27ac" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029894 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerID="357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74" exitCode=143 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030020 4751 generic.go:334] "Generic (PLEG): container finished" podID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" 
containerID="e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148" exitCode=143 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.029875 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030124 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030153 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030168 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030181 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030193 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74"} Jan 
31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030206 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030219 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" event={"ID":"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b","Type":"ContainerDied","Data":"4bbc5e8f3ce6775d094673644f5cb7355eba674b33cab2a960c6b275357e72b8"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.030236 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bbc5e8f3ce6775d094673644f5cb7355eba674b33cab2a960c6b275357e72b8" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.031362 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovn-acl-logging/0.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.032212 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovn-controller/0.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.033330 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/2.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.033379 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.033888 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/1.log" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.033951 4751 generic.go:334] "Generic (PLEG): container finished" podID="e7dd989b-33df-4562-a60b-f273428fea3d" containerID="98a2f0e75ca2c214fba50a70792a41195e5b7e674dbe1ae5b98cd015b7526483" exitCode=2 Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.033990 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerDied","Data":"98a2f0e75ca2c214fba50a70792a41195e5b7e674dbe1ae5b98cd015b7526483"} Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.034846 4751 scope.go:117] "RemoveContainer" containerID="98a2f0e75ca2c214fba50a70792a41195e5b7e674dbe1ae5b98cd015b7526483" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.035386 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-rtthp_openshift-multus(e7dd989b-33df-4562-a60b-f273428fea3d)\"" pod="openshift-multus/multus-rtthp" podUID="e7dd989b-33df-4562-a60b-f273428fea3d" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.053342 4751 scope.go:117] "RemoveContainer" containerID="2bdcbdac0cc4b17e027947c041a0ee4a7d7f549aa6dbe5c07c370ca7c0c50475" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.094214 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9bvhc"] Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.094878 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-node" Jan 31 
14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.095019 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-node" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.095180 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.095282 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.095377 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kubecfg-setup" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.095478 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kubecfg-setup" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.095590 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="northd" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.095689 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="northd" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.095804 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="nbdb" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.095905 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="nbdb" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.096004 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-acl-logging" Jan 31 14:52:15 crc 
kubenswrapper[4751]: I0131 14:52:15.096130 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-acl-logging" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.096267 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.096370 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.096475 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.096574 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.096671 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e18e163-6cf0-48ef-9a6f-90cbece870b0" containerName="registry" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.096770 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e18e163-6cf0-48ef-9a6f-90cbece870b0" containerName="registry" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.096874 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.096968 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.097109 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc 
kubenswrapper[4751]: I0131 14:52:15.097227 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.097334 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.097433 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.097538 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="sbdb" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.097629 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="sbdb" Jan 31 14:52:15 crc kubenswrapper[4751]: E0131 14:52:15.097728 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.097852 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098142 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098275 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="sbdb" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098382 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 
14:52:15.098484 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-node" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098630 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098763 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e18e163-6cf0-48ef-9a6f-90cbece870b0" containerName="registry" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098861 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.098959 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="nbdb" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.099060 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-controller" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.099201 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovn-acl-logging" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.099301 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.099437 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="northd" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.099864 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" containerName="ovnkube-controller" Jan 31 14:52:15 
crc kubenswrapper[4751]: I0131 14:52:15.102940 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106755 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovn-node-metrics-cert\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106823 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhmb7\" (UniqueName: \"kubernetes.io/projected/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-kube-api-access-zhmb7\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106857 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-systemd\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106883 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-netns\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106902 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-log-socket\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: 
I0131 14:52:15.106920 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-config\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106958 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-ovn\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.106982 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107014 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-slash\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107037 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-kubelet\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107091 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-script-lib\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107142 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-bin\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107163 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-systemd-units\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107192 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-netd\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107216 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-etc-openvswitch\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107244 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-ovn-kubernetes\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc 
kubenswrapper[4751]: I0131 14:52:15.107259 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-node-log\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107302 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-env-overrides\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107318 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-openvswitch\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107334 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-var-lib-openvswitch\") pod \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\" (UID: \"ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b\") " Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107514 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107605 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-log-socket" (OuterVolumeSpecName: "log-socket") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107674 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107750 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107801 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107854 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-slash" (OuterVolumeSpecName: "host-slash") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107917 4751 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-log-socket\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107942 4751 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107959 4751 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.107998 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108096 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108166 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108193 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108199 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-node-log" (OuterVolumeSpecName: "node-log") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108240 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108293 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108326 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108584 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.108623 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.114166 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.116991 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-kube-api-access-zhmb7" (OuterVolumeSpecName: "kube-api-access-zhmb7") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "kube-api-access-zhmb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.122210 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" (UID: "ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.208921 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-systemd\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209429 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209476 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-slash\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209512 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovnkube-config\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209630 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqmg\" (UniqueName: \"kubernetes.io/projected/5c1c10ff-f217-4a26-8bd1-7d4642d08976-kube-api-access-lqqmg\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209686 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-cni-netd\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209733 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-var-lib-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209790 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-node-log\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209839 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-env-overrides\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.209959 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-kubelet\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210026 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-cni-bin\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210127 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-run-netns\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210175 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-ovn\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210261 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovnkube-script-lib\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210326 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-log-socket\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210500 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210632 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovn-node-metrics-cert\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210696 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-systemd-units\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210742 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-etc-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210889 4751 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210918 4751 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210941 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210960 4751 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210977 4751 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-slash\") on node 
\"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.210995 4751 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211013 4751 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211032 4751 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211050 4751 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211093 4751 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211130 4751 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-node-log\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211149 4751 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211169 4751 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211186 4751 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211203 4751 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211220 4751 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.211238 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhmb7\" (UniqueName: \"kubernetes.io/projected/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b-kube-api-access-zhmb7\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312363 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-kubelet\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312450 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-cni-bin\") pod \"ovnkube-node-9bvhc\" (UID: 
\"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312501 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-run-netns\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312532 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-ovn\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312541 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-kubelet\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312570 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovnkube-script-lib\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312615 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-cni-bin\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 
crc kubenswrapper[4751]: I0131 14:52:15.312809 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-run-netns\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312853 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-log-socket\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312884 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-ovn\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312896 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312950 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.312971 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-log-socket\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313011 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313021 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovn-node-metrics-cert\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313112 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-systemd-units\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313018 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313146 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-etc-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313181 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-systemd-units\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313196 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-systemd\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313225 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313245 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-slash\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313264 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovnkube-config\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313296 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqmg\" (UniqueName: \"kubernetes.io/projected/5c1c10ff-f217-4a26-8bd1-7d4642d08976-kube-api-access-lqqmg\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313318 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-cni-netd\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313343 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-var-lib-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313343 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313376 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-node-log\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-node-log\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313401 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-slash\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313430 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-env-overrides\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313621 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-run-systemd\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313735 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-host-cni-netd\") pod \"ovnkube-node-9bvhc\" (UID: 
\"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313290 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-etc-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313799 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovnkube-script-lib\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.313818 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c1c10ff-f217-4a26-8bd1-7d4642d08976-var-lib-openvswitch\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.314416 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovnkube-config\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.314826 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c1c10ff-f217-4a26-8bd1-7d4642d08976-env-overrides\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 
14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.318553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c1c10ff-f217-4a26-8bd1-7d4642d08976-ovn-node-metrics-cert\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.337586 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqmg\" (UniqueName: \"kubernetes.io/projected/5c1c10ff-f217-4a26-8bd1-7d4642d08976-kube-api-access-lqqmg\") pod \"ovnkube-node-9bvhc\" (UID: \"5c1c10ff-f217-4a26-8bd1-7d4642d08976\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: I0131 14:52:15.421240 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:15 crc kubenswrapper[4751]: W0131 14:52:15.454960 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c1c10ff_f217_4a26_8bd1_7d4642d08976.slice/crio-d0edb5a30a101580494e50b6ac133f141a917c7ad10f4f7b5dad725c96af65ab WatchSource:0}: Error finding container d0edb5a30a101580494e50b6ac133f141a917c7ad10f4f7b5dad725c96af65ab: Status 404 returned error can't find the container with id d0edb5a30a101580494e50b6ac133f141a917c7ad10f4f7b5dad725c96af65ab Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.042473 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c1c10ff-f217-4a26-8bd1-7d4642d08976" containerID="d1c543d3283531f95fc28795636b70ac7c361ee5585f2bced078691fc7907cae" exitCode=0 Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.042558 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" 
event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerDied","Data":"d1c543d3283531f95fc28795636b70ac7c361ee5585f2bced078691fc7907cae"} Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.042972 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"d0edb5a30a101580494e50b6ac133f141a917c7ad10f4f7b5dad725c96af65ab"} Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.044770 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/2.log" Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.049884 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovn-acl-logging/0.log" Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.050536 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n8cdt_ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/ovn-controller/0.log" Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.051109 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n8cdt" Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.130131 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8cdt"] Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.134464 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n8cdt"] Jan 31 14:52:16 crc kubenswrapper[4751]: I0131 14:52:16.419238 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b" path="/var/lib/kubelet/pods/ceef6ba7-8d2d-4105-beee-6a8bdfd12c9b/volumes" Jan 31 14:52:17 crc kubenswrapper[4751]: I0131 14:52:17.060828 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"6be2c0aab27a1ea907b6e1922671cfb2dd1e74f772feba152bb0a87106238934"} Jan 31 14:52:17 crc kubenswrapper[4751]: I0131 14:52:17.061277 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"af8e7735fca292c3fa8c7d13ef2c2bed0aff024bf3578472b0f8e0537d6e1eac"} Jan 31 14:52:17 crc kubenswrapper[4751]: I0131 14:52:17.061296 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"5cd9a556e224cc7ab141d373b791f2ede7c5a7ecdb33c8aaa303590991f20727"} Jan 31 14:52:17 crc kubenswrapper[4751]: I0131 14:52:17.061308 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"fea6867b3acc68182fd2a6da399b36c6d7d7d0503753c6ce288b1a45e8a6ecbe"} Jan 31 14:52:17 crc kubenswrapper[4751]: I0131 
14:52:17.061320 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"79095c5483fb640b1ae163e960478ba096385b102267c80108d34889fd4fd3df"} Jan 31 14:52:17 crc kubenswrapper[4751]: I0131 14:52:17.061332 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"1dff3d05a267b9c9481ca9057c01eae797fb8297693b00d48d276f5b7ae061e3"} Jan 31 14:52:20 crc kubenswrapper[4751]: I0131 14:52:20.090889 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"4e30e3a765c67d68b9bd75d540c6f046d7b7840c6e140d3677658fd08f94febb"} Jan 31 14:52:22 crc kubenswrapper[4751]: I0131 14:52:22.109155 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" event={"ID":"5c1c10ff-f217-4a26-8bd1-7d4642d08976","Type":"ContainerStarted","Data":"b96fc3c9debec3be139ce32a8916361e28ffb3782ac29f4639739f6635a0b9c2"} Jan 31 14:52:22 crc kubenswrapper[4751]: I0131 14:52:22.110010 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:22 crc kubenswrapper[4751]: I0131 14:52:22.110026 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:22 crc kubenswrapper[4751]: I0131 14:52:22.141198 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:22 crc kubenswrapper[4751]: I0131 14:52:22.146113 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" 
podStartSLOduration=7.14609324 podStartE2EDuration="7.14609324s" podCreationTimestamp="2026-01-31 14:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:52:22.144371745 +0000 UTC m=+646.519084630" watchObservedRunningTime="2026-01-31 14:52:22.14609324 +0000 UTC m=+646.520806135" Jan 31 14:52:23 crc kubenswrapper[4751]: I0131 14:52:23.115908 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:23 crc kubenswrapper[4751]: I0131 14:52:23.154211 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:26 crc kubenswrapper[4751]: I0131 14:52:26.411393 4751 scope.go:117] "RemoveContainer" containerID="98a2f0e75ca2c214fba50a70792a41195e5b7e674dbe1ae5b98cd015b7526483" Jan 31 14:52:26 crc kubenswrapper[4751]: E0131 14:52:26.414341 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-rtthp_openshift-multus(e7dd989b-33df-4562-a60b-f273428fea3d)\"" pod="openshift-multus/multus-rtthp" podUID="e7dd989b-33df-4562-a60b-f273428fea3d" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.831978 4751 scope.go:117] "RemoveContainer" containerID="e8f6719bdf2fef7e21f7c5cdcb3de4b4f08e69bd62e22cb10fe157f638b3c148" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.861192 4751 scope.go:117] "RemoveContainer" containerID="153c98b7ebe36043f7ae094ec4ae3226c12652e95174c4ff2d00efc441bdb785" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.881957 4751 scope.go:117] "RemoveContainer" containerID="5623c591d9a6806335a9b671648fc172e749568a7ad29c3bd41f1a990ba56010" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.900656 4751 scope.go:117] "RemoveContainer" 
containerID="20273cc05bc490f8dd5258dccd02aa0e97d1d22c159ce2c177594920f4af83de" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.922438 4751 scope.go:117] "RemoveContainer" containerID="122856fd111512aaab2292d448505fb13a6bea5859652ab4e8bb512ee8d6aba9" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.948511 4751 scope.go:117] "RemoveContainer" containerID="f3446cde960ab284f70a338a9a4bfee334fdbdef503d868692fb847f4df6acba" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.979000 4751 scope.go:117] "RemoveContainer" containerID="357afc89f219ac34037aaeb7086076b36f05907502c53006aeb588eaf440cc74" Jan 31 14:52:36 crc kubenswrapper[4751]: I0131 14:52:36.993027 4751 scope.go:117] "RemoveContainer" containerID="e7f5e784895f315aa4d8fb6039b9e70c46f977a80fdd5d39ead81d75048abd6c" Jan 31 14:52:37 crc kubenswrapper[4751]: I0131 14:52:37.006152 4751 scope.go:117] "RemoveContainer" containerID="701178503ef461547e687442de0607af20cee8bca075347d1d1cb2a439562232" Jan 31 14:52:38 crc kubenswrapper[4751]: I0131 14:52:38.406813 4751 scope.go:117] "RemoveContainer" containerID="98a2f0e75ca2c214fba50a70792a41195e5b7e674dbe1ae5b98cd015b7526483" Jan 31 14:52:39 crc kubenswrapper[4751]: I0131 14:52:39.243438 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rtthp_e7dd989b-33df-4562-a60b-f273428fea3d/kube-multus/2.log" Jan 31 14:52:39 crc kubenswrapper[4751]: I0131 14:52:39.243798 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rtthp" event={"ID":"e7dd989b-33df-4562-a60b-f273428fea3d","Type":"ContainerStarted","Data":"8207fb66e68bce2bc8a8e7120aa05f6d77ac2b5b91eb20ca05dc568acd4bd4aa"} Jan 31 14:52:45 crc kubenswrapper[4751]: I0131 14:52:45.450807 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bvhc" Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.796682 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz"] Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.799333 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.802491 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.810414 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz"] Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.948657 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.948769 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:48 crc kubenswrapper[4751]: I0131 14:52:48.948836 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvc7\" (UniqueName: \"kubernetes.io/projected/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-kube-api-access-4zvc7\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.050189 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.050305 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.050378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvc7\" (UniqueName: \"kubernetes.io/projected/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-kube-api-access-4zvc7\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.050956 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.050978 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.091358 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvc7\" (UniqueName: \"kubernetes.io/projected/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-kube-api-access-4zvc7\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.117320 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:49 crc kubenswrapper[4751]: I0131 14:52:49.634127 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz"] Jan 31 14:52:50 crc kubenswrapper[4751]: I0131 14:52:50.308421 4751 generic.go:334] "Generic (PLEG): container finished" podID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerID="3efd438a3705eea1a3efedddcd876eeacfb008f62b18f56c151922dc22bf9158" exitCode=0 Jan 31 14:52:50 crc kubenswrapper[4751]: I0131 14:52:50.308478 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" event={"ID":"f3380dc7-49d9-4d61-a0bb-003c1c5e2742","Type":"ContainerDied","Data":"3efd438a3705eea1a3efedddcd876eeacfb008f62b18f56c151922dc22bf9158"} Jan 31 14:52:50 crc kubenswrapper[4751]: I0131 14:52:50.308749 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" event={"ID":"f3380dc7-49d9-4d61-a0bb-003c1c5e2742","Type":"ContainerStarted","Data":"40f87475e03d21ed5061858cc5fb9fd5c9dea64b23c1bf0fe86b08af74845ea6"} Jan 31 14:52:50 crc kubenswrapper[4751]: I0131 14:52:50.310592 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 14:52:51 crc kubenswrapper[4751]: I0131 14:52:51.316397 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" event={"ID":"f3380dc7-49d9-4d61-a0bb-003c1c5e2742","Type":"ContainerStarted","Data":"601bf14e8637916fa045ca6931615dde91e0b3bf3c4252b9f52fbc9df9e4bd03"} Jan 31 14:52:52 crc kubenswrapper[4751]: I0131 14:52:52.323687 4751 generic.go:334] "Generic (PLEG): container finished" podID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" 
containerID="601bf14e8637916fa045ca6931615dde91e0b3bf3c4252b9f52fbc9df9e4bd03" exitCode=0 Jan 31 14:52:52 crc kubenswrapper[4751]: I0131 14:52:52.323814 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" event={"ID":"f3380dc7-49d9-4d61-a0bb-003c1c5e2742","Type":"ContainerDied","Data":"601bf14e8637916fa045ca6931615dde91e0b3bf3c4252b9f52fbc9df9e4bd03"} Jan 31 14:52:53 crc kubenswrapper[4751]: I0131 14:52:53.330203 4751 generic.go:334] "Generic (PLEG): container finished" podID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerID="9ec8aed50894d62e18f3fb826346eabfdce4a94c9f2fae5ad6a9c65b1db62d96" exitCode=0 Jan 31 14:52:53 crc kubenswrapper[4751]: I0131 14:52:53.330250 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" event={"ID":"f3380dc7-49d9-4d61-a0bb-003c1c5e2742","Type":"ContainerDied","Data":"9ec8aed50894d62e18f3fb826346eabfdce4a94c9f2fae5ad6a9c65b1db62d96"} Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.610776 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.720677 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-bundle\") pod \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.720893 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-util\") pod \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.720982 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvc7\" (UniqueName: \"kubernetes.io/projected/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-kube-api-access-4zvc7\") pod \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\" (UID: \"f3380dc7-49d9-4d61-a0bb-003c1c5e2742\") " Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.722627 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-bundle" (OuterVolumeSpecName: "bundle") pod "f3380dc7-49d9-4d61-a0bb-003c1c5e2742" (UID: "f3380dc7-49d9-4d61-a0bb-003c1c5e2742"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.729322 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-kube-api-access-4zvc7" (OuterVolumeSpecName: "kube-api-access-4zvc7") pod "f3380dc7-49d9-4d61-a0bb-003c1c5e2742" (UID: "f3380dc7-49d9-4d61-a0bb-003c1c5e2742"). InnerVolumeSpecName "kube-api-access-4zvc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.750121 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-util" (OuterVolumeSpecName: "util") pod "f3380dc7-49d9-4d61-a0bb-003c1c5e2742" (UID: "f3380dc7-49d9-4d61-a0bb-003c1c5e2742"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.823145 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvc7\" (UniqueName: \"kubernetes.io/projected/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-kube-api-access-4zvc7\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.823195 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:54 crc kubenswrapper[4751]: I0131 14:52:54.823218 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3380dc7-49d9-4d61-a0bb-003c1c5e2742-util\") on node \"crc\" DevicePath \"\"" Jan 31 14:52:55 crc kubenswrapper[4751]: I0131 14:52:55.351131 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" event={"ID":"f3380dc7-49d9-4d61-a0bb-003c1c5e2742","Type":"ContainerDied","Data":"40f87475e03d21ed5061858cc5fb9fd5c9dea64b23c1bf0fe86b08af74845ea6"} Jan 31 14:52:55 crc kubenswrapper[4751]: I0131 14:52:55.351183 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40f87475e03d21ed5061858cc5fb9fd5c9dea64b23c1bf0fe86b08af74845ea6" Jan 31 14:52:55 crc kubenswrapper[4751]: I0131 14:52:55.351235 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.307771 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn"] Jan 31 14:53:04 crc kubenswrapper[4751]: E0131 14:53:04.308472 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="util" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.308484 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="util" Jan 31 14:53:04 crc kubenswrapper[4751]: E0131 14:53:04.308493 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="extract" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.308499 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="extract" Jan 31 14:53:04 crc kubenswrapper[4751]: E0131 14:53:04.308517 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="pull" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.308524 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="pull" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.308613 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3380dc7-49d9-4d61-a0bb-003c1c5e2742" containerName="extract" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.308989 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.310212 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.311130 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.311252 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.311300 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.311348 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zdzdv" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.323636 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn"] Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.446937 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd60e998-83e4-442a-98ac-c4e33d4b4765-apiservice-cert\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.447318 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd60e998-83e4-442a-98ac-c4e33d4b4765-webhook-cert\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: 
\"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.447352 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgrt\" (UniqueName: \"kubernetes.io/projected/bd60e998-83e4-442a-98ac-c4e33d4b4765-kube-api-access-tpgrt\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.544823 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78"] Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.545777 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.547240 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-98hk2" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.547263 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.547629 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.547942 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd60e998-83e4-442a-98ac-c4e33d4b4765-webhook-cert\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc 
kubenswrapper[4751]: I0131 14:53:04.547980 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgrt\" (UniqueName: \"kubernetes.io/projected/bd60e998-83e4-442a-98ac-c4e33d4b4765-kube-api-access-tpgrt\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.548020 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd60e998-83e4-442a-98ac-c4e33d4b4765-apiservice-cert\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.555035 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd60e998-83e4-442a-98ac-c4e33d4b4765-webhook-cert\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.557914 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd60e998-83e4-442a-98ac-c4e33d4b4765-apiservice-cert\") pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.566659 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgrt\" (UniqueName: \"kubernetes.io/projected/bd60e998-83e4-442a-98ac-c4e33d4b4765-kube-api-access-tpgrt\") 
pod \"metallb-operator-controller-manager-6b999687d7-vf7mn\" (UID: \"bd60e998-83e4-442a-98ac-c4e33d4b4765\") " pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.572089 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78"] Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.624566 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.649614 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01320eb9-ccb5-4593-866a-f49553fa7262-webhook-cert\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.649673 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnvst\" (UniqueName: \"kubernetes.io/projected/01320eb9-ccb5-4593-866a-f49553fa7262-kube-api-access-hnvst\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.649729 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01320eb9-ccb5-4593-866a-f49553fa7262-apiservice-cert\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 
14:53:04.751600 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01320eb9-ccb5-4593-866a-f49553fa7262-webhook-cert\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.751654 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnvst\" (UniqueName: \"kubernetes.io/projected/01320eb9-ccb5-4593-866a-f49553fa7262-kube-api-access-hnvst\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.751699 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01320eb9-ccb5-4593-866a-f49553fa7262-apiservice-cert\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.755374 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01320eb9-ccb5-4593-866a-f49553fa7262-webhook-cert\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.755652 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01320eb9-ccb5-4593-866a-f49553fa7262-apiservice-cert\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: 
\"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.767990 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnvst\" (UniqueName: \"kubernetes.io/projected/01320eb9-ccb5-4593-866a-f49553fa7262-kube-api-access-hnvst\") pod \"metallb-operator-webhook-server-5c46dd7d46-8xt78\" (UID: \"01320eb9-ccb5-4593-866a-f49553fa7262\") " pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.831210 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn"] Jan 31 14:53:04 crc kubenswrapper[4751]: W0131 14:53:04.844090 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd60e998_83e4_442a_98ac_c4e33d4b4765.slice/crio-87d9ee3b1421f1fc472f483c122f5e08c3e3c2b1658c093f196ba55ec588a8e2 WatchSource:0}: Error finding container 87d9ee3b1421f1fc472f483c122f5e08c3e3c2b1658c093f196ba55ec588a8e2: Status 404 returned error can't find the container with id 87d9ee3b1421f1fc472f483c122f5e08c3e3c2b1658c093f196ba55ec588a8e2 Jan 31 14:53:04 crc kubenswrapper[4751]: I0131 14:53:04.900040 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:05 crc kubenswrapper[4751]: W0131 14:53:05.107797 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01320eb9_ccb5_4593_866a_f49553fa7262.slice/crio-fa5fc459280f07966a5966cfa6c51e8689e2d93ddb81e4227e19d9c78d8a044e WatchSource:0}: Error finding container fa5fc459280f07966a5966cfa6c51e8689e2d93ddb81e4227e19d9c78d8a044e: Status 404 returned error can't find the container with id fa5fc459280f07966a5966cfa6c51e8689e2d93ddb81e4227e19d9c78d8a044e Jan 31 14:53:05 crc kubenswrapper[4751]: I0131 14:53:05.109403 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78"] Jan 31 14:53:05 crc kubenswrapper[4751]: I0131 14:53:05.405959 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" event={"ID":"bd60e998-83e4-442a-98ac-c4e33d4b4765","Type":"ContainerStarted","Data":"87d9ee3b1421f1fc472f483c122f5e08c3e3c2b1658c093f196ba55ec588a8e2"} Jan 31 14:53:05 crc kubenswrapper[4751]: I0131 14:53:05.408446 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" event={"ID":"01320eb9-ccb5-4593-866a-f49553fa7262","Type":"ContainerStarted","Data":"fa5fc459280f07966a5966cfa6c51e8689e2d93ddb81e4227e19d9c78d8a044e"} Jan 31 14:53:10 crc kubenswrapper[4751]: I0131 14:53:10.450043 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" event={"ID":"01320eb9-ccb5-4593-866a-f49553fa7262","Type":"ContainerStarted","Data":"2f7a224d466d03f11dd0cffdf6425400f947fda4e2e007bf4265a921bfaa57d4"} Jan 31 14:53:10 crc kubenswrapper[4751]: I0131 14:53:10.450698 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:10 crc kubenswrapper[4751]: I0131 14:53:10.453290 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" event={"ID":"bd60e998-83e4-442a-98ac-c4e33d4b4765","Type":"ContainerStarted","Data":"c931d1a3cbf0137ad695710fb71fc10ddceeefcd678f5a98931970128e0dd1f9"} Jan 31 14:53:10 crc kubenswrapper[4751]: I0131 14:53:10.453449 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:10 crc kubenswrapper[4751]: I0131 14:53:10.520404 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" podStartSLOduration=2.047920445 podStartE2EDuration="6.52037971s" podCreationTimestamp="2026-01-31 14:53:04 +0000 UTC" firstStartedPulling="2026-01-31 14:53:05.113147079 +0000 UTC m=+689.487859964" lastFinishedPulling="2026-01-31 14:53:09.585606344 +0000 UTC m=+693.960319229" observedRunningTime="2026-01-31 14:53:10.483340836 +0000 UTC m=+694.858053741" watchObservedRunningTime="2026-01-31 14:53:10.52037971 +0000 UTC m=+694.895092615" Jan 31 14:53:10 crc kubenswrapper[4751]: I0131 14:53:10.524807 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" podStartSLOduration=1.813497146 podStartE2EDuration="6.524788537s" podCreationTimestamp="2026-01-31 14:53:04 +0000 UTC" firstStartedPulling="2026-01-31 14:53:04.847416979 +0000 UTC m=+689.222129864" lastFinishedPulling="2026-01-31 14:53:09.55870837 +0000 UTC m=+693.933421255" observedRunningTime="2026-01-31 14:53:10.516959489 +0000 UTC m=+694.891672394" watchObservedRunningTime="2026-01-31 14:53:10.524788537 +0000 UTC m=+694.899501442" Jan 31 14:53:24 crc kubenswrapper[4751]: I0131 14:53:24.907583 4751 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5c46dd7d46-8xt78" Jan 31 14:53:44 crc kubenswrapper[4751]: I0131 14:53:44.627559 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6b999687d7-vf7mn" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.251169 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9z9n2"] Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.254134 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.255336 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j"] Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.256124 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.256573 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rqxft" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.256845 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.256957 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.257382 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.275626 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j"] Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305180 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvz6\" (UniqueName: \"kubernetes.io/projected/b1f214e9-14db-462f-900c-3652ec7908e5-kube-api-access-zkvz6\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305231 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1f214e9-14db-462f-900c-3652ec7908e5-frr-startup\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305637 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1f214e9-14db-462f-900c-3652ec7908e5-metrics-certs\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305778 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-frr-sockets\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305832 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-frr-conf\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305879 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-reloader\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.305963 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-metrics\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.340000 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qv6gh"] Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.341241 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.345044 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.345114 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-sp9w4" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.345527 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.346749 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.355993 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-6dhf9"] Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.356903 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.360896 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.369045 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-6dhf9"] Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406686 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b667c31-e911-496a-9c8b-12c906e724ec-cert\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406724 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-metrics\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406743 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406795 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkvz6\" (UniqueName: \"kubernetes.io/projected/b1f214e9-14db-462f-900c-3652ec7908e5-kube-api-access-zkvz6\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406818 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1f214e9-14db-462f-900c-3652ec7908e5-frr-startup\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406836 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94655b12-be6a-4043-8f7c-80d1b7fb1a2f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-qf86j\" (UID: \"94655b12-be6a-4043-8f7c-80d1b7fb1a2f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406852 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-metallb-excludel2\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406872 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-metrics-certs\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406894 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prsbb\" (UniqueName: \"kubernetes.io/projected/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-kube-api-access-prsbb\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406912 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1f214e9-14db-462f-900c-3652ec7908e5-metrics-certs\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406944 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-frr-sockets\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.406957 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-frr-conf\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407051 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-metrics\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407078 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-reloader\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407125 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mdn\" (UniqueName: \"kubernetes.io/projected/94655b12-be6a-4043-8f7c-80d1b7fb1a2f-kube-api-access-92mdn\") pod \"frr-k8s-webhook-server-7df86c4f6c-qf86j\" (UID: 
\"94655b12-be6a-4043-8f7c-80d1b7fb1a2f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407147 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjmxk\" (UniqueName: \"kubernetes.io/projected/6b667c31-e911-496a-9c8b-12c906e724ec-kube-api-access-tjmxk\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407165 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-frr-conf\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407173 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b667c31-e911-496a-9c8b-12c906e724ec-metrics-certs\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407481 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-frr-sockets\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.407607 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b1f214e9-14db-462f-900c-3652ec7908e5-reloader\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc 
kubenswrapper[4751]: I0131 14:53:45.407781 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b1f214e9-14db-462f-900c-3652ec7908e5-frr-startup\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.419682 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1f214e9-14db-462f-900c-3652ec7908e5-metrics-certs\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.432885 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkvz6\" (UniqueName: \"kubernetes.io/projected/b1f214e9-14db-462f-900c-3652ec7908e5-kube-api-access-zkvz6\") pod \"frr-k8s-9z9n2\" (UID: \"b1f214e9-14db-462f-900c-3652ec7908e5\") " pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.507864 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mdn\" (UniqueName: \"kubernetes.io/projected/94655b12-be6a-4043-8f7c-80d1b7fb1a2f-kube-api-access-92mdn\") pod \"frr-k8s-webhook-server-7df86c4f6c-qf86j\" (UID: \"94655b12-be6a-4043-8f7c-80d1b7fb1a2f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.507907 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjmxk\" (UniqueName: \"kubernetes.io/projected/6b667c31-e911-496a-9c8b-12c906e724ec-kube-api-access-tjmxk\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.507923 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b667c31-e911-496a-9c8b-12c906e724ec-metrics-certs\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.507942 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b667c31-e911-496a-9c8b-12c906e724ec-cert\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.507965 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.507987 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-metallb-excludel2\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.508005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94655b12-be6a-4043-8f7c-80d1b7fb1a2f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-qf86j\" (UID: \"94655b12-be6a-4043-8f7c-80d1b7fb1a2f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.508026 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-metrics-certs\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.508047 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prsbb\" (UniqueName: \"kubernetes.io/projected/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-kube-api-access-prsbb\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: E0131 14:53:45.508228 4751 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 14:53:45 crc kubenswrapper[4751]: E0131 14:53:45.508286 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist podName:7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc nodeName:}" failed. No retries permitted until 2026-01-31 14:53:46.008266003 +0000 UTC m=+730.382978888 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist") pod "speaker-qv6gh" (UID: "7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc") : secret "metallb-memberlist" not found Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.508747 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-metallb-excludel2\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.525755 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-metrics-certs\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.525911 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94655b12-be6a-4043-8f7c-80d1b7fb1a2f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-qf86j\" (UID: \"94655b12-be6a-4043-8f7c-80d1b7fb1a2f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.525946 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b667c31-e911-496a-9c8b-12c906e724ec-metrics-certs\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.526058 4751 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.529876 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-prsbb\" (UniqueName: \"kubernetes.io/projected/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-kube-api-access-prsbb\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.530130 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b667c31-e911-496a-9c8b-12c906e724ec-cert\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.532264 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mdn\" (UniqueName: \"kubernetes.io/projected/94655b12-be6a-4043-8f7c-80d1b7fb1a2f-kube-api-access-92mdn\") pod \"frr-k8s-webhook-server-7df86c4f6c-qf86j\" (UID: \"94655b12-be6a-4043-8f7c-80d1b7fb1a2f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.532301 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjmxk\" (UniqueName: \"kubernetes.io/projected/6b667c31-e911-496a-9c8b-12c906e724ec-kube-api-access-tjmxk\") pod \"controller-6968d8fdc4-6dhf9\" (UID: \"6b667c31-e911-496a-9c8b-12c906e724ec\") " pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.571587 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.576305 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.670763 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:45 crc kubenswrapper[4751]: I0131 14:53:45.891209 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-6dhf9"] Jan 31 14:53:45 crc kubenswrapper[4751]: W0131 14:53:45.896215 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b667c31_e911_496a_9c8b_12c906e724ec.slice/crio-d4d116ee8dafcae9c090310f10d6ed4f9123268eef0738f0033c39ed2c3f8b2b WatchSource:0}: Error finding container d4d116ee8dafcae9c090310f10d6ed4f9123268eef0738f0033c39ed2c3f8b2b: Status 404 returned error can't find the container with id d4d116ee8dafcae9c090310f10d6ed4f9123268eef0738f0033c39ed2c3f8b2b Jan 31 14:53:46 crc kubenswrapper[4751]: I0131 14:53:46.039218 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:46 crc kubenswrapper[4751]: E0131 14:53:46.039375 4751 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 14:53:46 crc kubenswrapper[4751]: E0131 14:53:46.039593 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist podName:7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc nodeName:}" failed. No retries permitted until 2026-01-31 14:53:47.03957607 +0000 UTC m=+731.414288955 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist") pod "speaker-qv6gh" (UID: "7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc") : secret "metallb-memberlist" not found Jan 31 14:53:46 crc kubenswrapper[4751]: I0131 14:53:46.053426 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j"] Jan 31 14:53:46 crc kubenswrapper[4751]: W0131 14:53:46.056764 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94655b12_be6a_4043_8f7c_80d1b7fb1a2f.slice/crio-389496d9322ade055f02884ba64abb01cdde5b6cd775d1f30ed0ee2c3cb8a7ae WatchSource:0}: Error finding container 389496d9322ade055f02884ba64abb01cdde5b6cd775d1f30ed0ee2c3cb8a7ae: Status 404 returned error can't find the container with id 389496d9322ade055f02884ba64abb01cdde5b6cd775d1f30ed0ee2c3cb8a7ae Jan 31 14:53:46 crc kubenswrapper[4751]: I0131 14:53:46.765316 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"518736c27fabd294ea93ad285af465864a682c0d8cb72298fefc50f24acee05b"} Jan 31 14:53:46 crc kubenswrapper[4751]: I0131 14:53:46.767383 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-6dhf9" event={"ID":"6b667c31-e911-496a-9c8b-12c906e724ec","Type":"ContainerStarted","Data":"a8b15329d8728f30cc4260b6a3385271f981350e73f6b2e23267cc83aae5ba6f"} Jan 31 14:53:46 crc kubenswrapper[4751]: I0131 14:53:46.767846 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-6dhf9" event={"ID":"6b667c31-e911-496a-9c8b-12c906e724ec","Type":"ContainerStarted","Data":"d4d116ee8dafcae9c090310f10d6ed4f9123268eef0738f0033c39ed2c3f8b2b"} Jan 31 14:53:46 crc kubenswrapper[4751]: I0131 14:53:46.769229 4751 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" event={"ID":"94655b12-be6a-4043-8f7c-80d1b7fb1a2f","Type":"ContainerStarted","Data":"389496d9322ade055f02884ba64abb01cdde5b6cd775d1f30ed0ee2c3cb8a7ae"} Jan 31 14:53:47 crc kubenswrapper[4751]: I0131 14:53:47.051687 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:47 crc kubenswrapper[4751]: I0131 14:53:47.059409 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc-memberlist\") pod \"speaker-qv6gh\" (UID: \"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc\") " pod="metallb-system/speaker-qv6gh" Jan 31 14:53:47 crc kubenswrapper[4751]: I0131 14:53:47.154203 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qv6gh" Jan 31 14:53:47 crc kubenswrapper[4751]: I0131 14:53:47.792559 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qv6gh" event={"ID":"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc","Type":"ContainerStarted","Data":"190173bc608108504a2a8a915f84497adc2dbddcc38e89ac61add1a81967cf73"} Jan 31 14:53:47 crc kubenswrapper[4751]: I0131 14:53:47.792603 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qv6gh" event={"ID":"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc","Type":"ContainerStarted","Data":"4ab8a47082d2773e0e4484534cf763c30b1a72d1b83d66d59262f5fa02591767"} Jan 31 14:53:49 crc kubenswrapper[4751]: I0131 14:53:49.807499 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-6dhf9" event={"ID":"6b667c31-e911-496a-9c8b-12c906e724ec","Type":"ContainerStarted","Data":"8bb42588577eb54f01b6d5e02179bed292d5a8faa442ca4ef9d8daa015180af1"} Jan 31 14:53:49 crc kubenswrapper[4751]: I0131 14:53:49.809249 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:49 crc kubenswrapper[4751]: I0131 14:53:49.815817 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qv6gh" event={"ID":"7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc","Type":"ContainerStarted","Data":"29c7b873bfa2415d9deba65b7ff9c29b6493af1d4c749b28d36a4b6c43e9814c"} Jan 31 14:53:49 crc kubenswrapper[4751]: I0131 14:53:49.815913 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qv6gh" Jan 31 14:53:49 crc kubenswrapper[4751]: I0131 14:53:49.826650 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-6dhf9" podStartSLOduration=1.857049602 podStartE2EDuration="4.826629189s" podCreationTimestamp="2026-01-31 14:53:45 +0000 UTC" firstStartedPulling="2026-01-31 14:53:46.011237893 
+0000 UTC m=+730.385950778" lastFinishedPulling="2026-01-31 14:53:48.98081748 +0000 UTC m=+733.355530365" observedRunningTime="2026-01-31 14:53:49.823713332 +0000 UTC m=+734.198426217" watchObservedRunningTime="2026-01-31 14:53:49.826629189 +0000 UTC m=+734.201342074" Jan 31 14:53:49 crc kubenswrapper[4751]: I0131 14:53:49.846842 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qv6gh" podStartSLOduration=3.432028772 podStartE2EDuration="4.846810691s" podCreationTimestamp="2026-01-31 14:53:45 +0000 UTC" firstStartedPulling="2026-01-31 14:53:47.57584837 +0000 UTC m=+731.950561255" lastFinishedPulling="2026-01-31 14:53:48.990630289 +0000 UTC m=+733.365343174" observedRunningTime="2026-01-31 14:53:49.845387913 +0000 UTC m=+734.220100838" watchObservedRunningTime="2026-01-31 14:53:49.846810691 +0000 UTC m=+734.221523616" Jan 31 14:53:52 crc kubenswrapper[4751]: I0131 14:53:52.835931 4751 generic.go:334] "Generic (PLEG): container finished" podID="b1f214e9-14db-462f-900c-3652ec7908e5" containerID="e257b83e2996505f76b5e69df401be892c8d5b5e9f72784f8bf893f34751a7e9" exitCode=0 Jan 31 14:53:52 crc kubenswrapper[4751]: I0131 14:53:52.836126 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerDied","Data":"e257b83e2996505f76b5e69df401be892c8d5b5e9f72784f8bf893f34751a7e9"} Jan 31 14:53:52 crc kubenswrapper[4751]: I0131 14:53:52.840531 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" event={"ID":"94655b12-be6a-4043-8f7c-80d1b7fb1a2f","Type":"ContainerStarted","Data":"5574e9a95192d2c8caf42a417e367b5c74a01a96dca3b6ace3a0b7e862064801"} Jan 31 14:53:52 crc kubenswrapper[4751]: I0131 14:53:52.841115 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:53:52 crc 
kubenswrapper[4751]: I0131 14:53:52.895110 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" podStartSLOduration=1.454432017 podStartE2EDuration="7.895059953s" podCreationTimestamp="2026-01-31 14:53:45 +0000 UTC" firstStartedPulling="2026-01-31 14:53:46.059196557 +0000 UTC m=+730.433909452" lastFinishedPulling="2026-01-31 14:53:52.499824503 +0000 UTC m=+736.874537388" observedRunningTime="2026-01-31 14:53:52.883288222 +0000 UTC m=+737.258001137" watchObservedRunningTime="2026-01-31 14:53:52.895059953 +0000 UTC m=+737.269772878" Jan 31 14:53:53 crc kubenswrapper[4751]: I0131 14:53:53.854340 4751 generic.go:334] "Generic (PLEG): container finished" podID="b1f214e9-14db-462f-900c-3652ec7908e5" containerID="b98b40b2437353e75bd1003d04e4148c2446d403f633b173d2a155e2c2d07298" exitCode=0 Jan 31 14:53:53 crc kubenswrapper[4751]: I0131 14:53:53.854416 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerDied","Data":"b98b40b2437353e75bd1003d04e4148c2446d403f633b173d2a155e2c2d07298"} Jan 31 14:53:54 crc kubenswrapper[4751]: I0131 14:53:54.864179 4751 generic.go:334] "Generic (PLEG): container finished" podID="b1f214e9-14db-462f-900c-3652ec7908e5" containerID="6828715cee1ab3682671ec05297551484dd98c12c6fd0285752d41b1067aee54" exitCode=0 Jan 31 14:53:54 crc kubenswrapper[4751]: I0131 14:53:54.867145 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerDied","Data":"6828715cee1ab3682671ec05297551484dd98c12c6fd0285752d41b1067aee54"} Jan 31 14:53:55 crc kubenswrapper[4751]: I0131 14:53:55.678267 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-6dhf9" Jan 31 14:53:55 crc kubenswrapper[4751]: I0131 14:53:55.879134 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"b46a14e9de29292256b412db85b552467e7f108925d938e36716fa0bdd0d2eff"} Jan 31 14:53:55 crc kubenswrapper[4751]: I0131 14:53:55.879174 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"1f010f84732b788fb5ace83d817423b09432c5953343ba03b1d5e092938cfc31"} Jan 31 14:53:55 crc kubenswrapper[4751]: I0131 14:53:55.879185 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"1e02ed482355326b14499113df95495187e4101200954abfc593f8342f579635"} Jan 31 14:53:55 crc kubenswrapper[4751]: I0131 14:53:55.879197 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"305700feab0f0c008275b1522e6cf82c889e25a0932e5d9125d079744f803262"} Jan 31 14:53:55 crc kubenswrapper[4751]: I0131 14:53:55.879206 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"a06051de686fba3f136e364fbce12cfafe61eff286ab01c7331656ea2bea5ca4"} Jan 31 14:53:56 crc kubenswrapper[4751]: I0131 14:53:56.885815 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9z9n2" event={"ID":"b1f214e9-14db-462f-900c-3652ec7908e5","Type":"ContainerStarted","Data":"56b997727b93f490fefed1daa5c38fe367d74cc0f7af37afe6e03a931ce56c3e"} Jan 31 14:53:56 crc kubenswrapper[4751]: I0131 14:53:56.886278 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:53:56 crc kubenswrapper[4751]: I0131 14:53:56.915330 4751 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9z9n2" podStartSLOduration=5.194901807 podStartE2EDuration="11.915310029s" podCreationTimestamp="2026-01-31 14:53:45 +0000 UTC" firstStartedPulling="2026-01-31 14:53:45.763796509 +0000 UTC m=+730.138509394" lastFinishedPulling="2026-01-31 14:53:52.484204721 +0000 UTC m=+736.858917616" observedRunningTime="2026-01-31 14:53:56.911189251 +0000 UTC m=+741.285902146" watchObservedRunningTime="2026-01-31 14:53:56.915310029 +0000 UTC m=+741.290022914" Jan 31 14:53:57 crc kubenswrapper[4751]: I0131 14:53:57.160016 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qv6gh" Jan 31 14:54:00 crc kubenswrapper[4751]: I0131 14:54:00.571887 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:54:00 crc kubenswrapper[4751]: I0131 14:54:00.639609 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.591308 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-8khjf"] Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.593491 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.596469 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.596824 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-rlk9f" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.596891 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.613720 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-8khjf"] Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.711524 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4t5\" (UniqueName: \"kubernetes.io/projected/45186c12-b6c6-4360-91c6-f44b7a20835c-kube-api-access-ct4t5\") pod \"mariadb-operator-index-8khjf\" (UID: \"45186c12-b6c6-4360-91c6-f44b7a20835c\") " pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.813287 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4t5\" (UniqueName: \"kubernetes.io/projected/45186c12-b6c6-4360-91c6-f44b7a20835c-kube-api-access-ct4t5\") pod \"mariadb-operator-index-8khjf\" (UID: \"45186c12-b6c6-4360-91c6-f44b7a20835c\") " pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.833026 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4t5\" (UniqueName: \"kubernetes.io/projected/45186c12-b6c6-4360-91c6-f44b7a20835c-kube-api-access-ct4t5\") pod \"mariadb-operator-index-8khjf\" (UID: \"45186c12-b6c6-4360-91c6-f44b7a20835c\") " 
pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:02 crc kubenswrapper[4751]: I0131 14:54:02.940095 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:03 crc kubenswrapper[4751]: I0131 14:54:03.168703 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-8khjf"] Jan 31 14:54:03 crc kubenswrapper[4751]: W0131 14:54:03.182280 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45186c12_b6c6_4360_91c6_f44b7a20835c.slice/crio-df12732ce1dcb1caf73b1304f1223fbf619c3f1573236232e36e405a77fa3ed6 WatchSource:0}: Error finding container df12732ce1dcb1caf73b1304f1223fbf619c3f1573236232e36e405a77fa3ed6: Status 404 returned error can't find the container with id df12732ce1dcb1caf73b1304f1223fbf619c3f1573236232e36e405a77fa3ed6 Jan 31 14:54:03 crc kubenswrapper[4751]: I0131 14:54:03.932129 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8khjf" event={"ID":"45186c12-b6c6-4360-91c6-f44b7a20835c","Type":"ContainerStarted","Data":"df12732ce1dcb1caf73b1304f1223fbf619c3f1573236232e36e405a77fa3ed6"} Jan 31 14:54:04 crc kubenswrapper[4751]: I0131 14:54:04.943232 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8khjf" event={"ID":"45186c12-b6c6-4360-91c6-f44b7a20835c","Type":"ContainerStarted","Data":"6676f9025b980659e32fdcadd8ff687e74b4114ecb7fb9a20f0836497a011c64"} Jan 31 14:54:04 crc kubenswrapper[4751]: I0131 14:54:04.964566 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-8khjf" podStartSLOduration=2.052181049 podStartE2EDuration="2.964535743s" podCreationTimestamp="2026-01-31 14:54:02 +0000 UTC" firstStartedPulling="2026-01-31 14:54:03.184855244 +0000 UTC 
m=+747.559568129" lastFinishedPulling="2026-01-31 14:54:04.097209918 +0000 UTC m=+748.471922823" observedRunningTime="2026-01-31 14:54:04.959673655 +0000 UTC m=+749.334386580" watchObservedRunningTime="2026-01-31 14:54:04.964535743 +0000 UTC m=+749.339248668" Jan 31 14:54:05 crc kubenswrapper[4751]: I0131 14:54:05.575837 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9z9n2" Jan 31 14:54:05 crc kubenswrapper[4751]: I0131 14:54:05.583333 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qf86j" Jan 31 14:54:05 crc kubenswrapper[4751]: I0131 14:54:05.964277 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-8khjf"] Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.569768 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-lpshr"] Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.570546 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.582907 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-lpshr"] Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.675612 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltq9\" (UniqueName: \"kubernetes.io/projected/11fab5ff-3041-45d3-8aab-29e25ed8c6ae-kube-api-access-hltq9\") pod \"mariadb-operator-index-lpshr\" (UID: \"11fab5ff-3041-45d3-8aab-29e25ed8c6ae\") " pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.776568 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltq9\" (UniqueName: \"kubernetes.io/projected/11fab5ff-3041-45d3-8aab-29e25ed8c6ae-kube-api-access-hltq9\") pod \"mariadb-operator-index-lpshr\" (UID: \"11fab5ff-3041-45d3-8aab-29e25ed8c6ae\") " pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.826751 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltq9\" (UniqueName: \"kubernetes.io/projected/11fab5ff-3041-45d3-8aab-29e25ed8c6ae-kube-api-access-hltq9\") pod \"mariadb-operator-index-lpshr\" (UID: \"11fab5ff-3041-45d3-8aab-29e25ed8c6ae\") " pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.894233 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:06 crc kubenswrapper[4751]: I0131 14:54:06.957470 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-8khjf" podUID="45186c12-b6c6-4360-91c6-f44b7a20835c" containerName="registry-server" containerID="cri-o://6676f9025b980659e32fdcadd8ff687e74b4114ecb7fb9a20f0836497a011c64" gracePeriod=2 Jan 31 14:54:07 crc kubenswrapper[4751]: I0131 14:54:07.106382 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-lpshr"] Jan 31 14:54:07 crc kubenswrapper[4751]: W0131 14:54:07.116259 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11fab5ff_3041_45d3_8aab_29e25ed8c6ae.slice/crio-713674d4d326d8545cf66e50aee47bded08afa8e12a04a763f8540e3552c31dd WatchSource:0}: Error finding container 713674d4d326d8545cf66e50aee47bded08afa8e12a04a763f8540e3552c31dd: Status 404 returned error can't find the container with id 713674d4d326d8545cf66e50aee47bded08afa8e12a04a763f8540e3552c31dd Jan 31 14:54:07 crc kubenswrapper[4751]: I0131 14:54:07.965598 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lpshr" event={"ID":"11fab5ff-3041-45d3-8aab-29e25ed8c6ae","Type":"ContainerStarted","Data":"713674d4d326d8545cf66e50aee47bded08afa8e12a04a763f8540e3552c31dd"} Jan 31 14:54:07 crc kubenswrapper[4751]: I0131 14:54:07.967715 4751 generic.go:334] "Generic (PLEG): container finished" podID="45186c12-b6c6-4360-91c6-f44b7a20835c" containerID="6676f9025b980659e32fdcadd8ff687e74b4114ecb7fb9a20f0836497a011c64" exitCode=0 Jan 31 14:54:07 crc kubenswrapper[4751]: I0131 14:54:07.967780 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8khjf" 
event={"ID":"45186c12-b6c6-4360-91c6-f44b7a20835c","Type":"ContainerDied","Data":"6676f9025b980659e32fdcadd8ff687e74b4114ecb7fb9a20f0836497a011c64"} Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.358162 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.423528 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct4t5\" (UniqueName: \"kubernetes.io/projected/45186c12-b6c6-4360-91c6-f44b7a20835c-kube-api-access-ct4t5\") pod \"45186c12-b6c6-4360-91c6-f44b7a20835c\" (UID: \"45186c12-b6c6-4360-91c6-f44b7a20835c\") " Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.431603 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45186c12-b6c6-4360-91c6-f44b7a20835c-kube-api-access-ct4t5" (OuterVolumeSpecName: "kube-api-access-ct4t5") pod "45186c12-b6c6-4360-91c6-f44b7a20835c" (UID: "45186c12-b6c6-4360-91c6-f44b7a20835c"). InnerVolumeSpecName "kube-api-access-ct4t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.525199 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct4t5\" (UniqueName: \"kubernetes.io/projected/45186c12-b6c6-4360-91c6-f44b7a20835c-kube-api-access-ct4t5\") on node \"crc\" DevicePath \"\"" Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.976968 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-8khjf" Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.976985 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-8khjf" event={"ID":"45186c12-b6c6-4360-91c6-f44b7a20835c","Type":"ContainerDied","Data":"df12732ce1dcb1caf73b1304f1223fbf619c3f1573236232e36e405a77fa3ed6"} Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.977110 4751 scope.go:117] "RemoveContainer" containerID="6676f9025b980659e32fdcadd8ff687e74b4114ecb7fb9a20f0836497a011c64" Jan 31 14:54:08 crc kubenswrapper[4751]: I0131 14:54:08.979577 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lpshr" event={"ID":"11fab5ff-3041-45d3-8aab-29e25ed8c6ae","Type":"ContainerStarted","Data":"ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63"} Jan 31 14:54:09 crc kubenswrapper[4751]: I0131 14:54:09.019786 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-lpshr" podStartSLOduration=2.076689559 podStartE2EDuration="3.019761281s" podCreationTimestamp="2026-01-31 14:54:06 +0000 UTC" firstStartedPulling="2026-01-31 14:54:07.119237658 +0000 UTC m=+751.493950543" lastFinishedPulling="2026-01-31 14:54:08.06230934 +0000 UTC m=+752.437022265" observedRunningTime="2026-01-31 14:54:09.008693469 +0000 UTC m=+753.383406384" watchObservedRunningTime="2026-01-31 14:54:09.019761281 +0000 UTC m=+753.394474196" Jan 31 14:54:09 crc kubenswrapper[4751]: I0131 14:54:09.027682 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-8khjf"] Jan 31 14:54:09 crc kubenswrapper[4751]: I0131 14:54:09.033825 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-8khjf"] Jan 31 14:54:09 crc kubenswrapper[4751]: I0131 14:54:09.591758 4751 dynamic_cafile_content.go:123] "Loaded a new 
CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 14:54:10 crc kubenswrapper[4751]: I0131 14:54:10.417443 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45186c12-b6c6-4360-91c6-f44b7a20835c" path="/var/lib/kubelet/pods/45186c12-b6c6-4360-91c6-f44b7a20835c/volumes" Jan 31 14:54:16 crc kubenswrapper[4751]: I0131 14:54:16.894845 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:16 crc kubenswrapper[4751]: I0131 14:54:16.895306 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:16 crc kubenswrapper[4751]: I0131 14:54:16.937896 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:17 crc kubenswrapper[4751]: I0131 14:54:17.084013 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-lpshr" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.207924 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q"] Jan 31 14:54:18 crc kubenswrapper[4751]: E0131 14:54:18.208444 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45186c12-b6c6-4360-91c6-f44b7a20835c" containerName="registry-server" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.208458 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="45186c12-b6c6-4360-91c6-f44b7a20835c" containerName="registry-server" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.208595 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="45186c12-b6c6-4360-91c6-f44b7a20835c" containerName="registry-server" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.209474 4751 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.211690 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wxkjx" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.223276 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q"] Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.290447 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.290505 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwwbj\" (UniqueName: \"kubernetes.io/projected/667a6cec-bf73-4340-9be6-f4bc10182004-kube-api-access-bwwbj\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.290613 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc 
kubenswrapper[4751]: I0131 14:54:18.391973 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.392036 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwwbj\" (UniqueName: \"kubernetes.io/projected/667a6cec-bf73-4340-9be6-f4bc10182004-kube-api-access-bwwbj\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.392195 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.392623 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.392946 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.426329 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwwbj\" (UniqueName: \"kubernetes.io/projected/667a6cec-bf73-4340-9be6-f4bc10182004-kube-api-access-bwwbj\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.539587 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:18 crc kubenswrapper[4751]: I0131 14:54:18.785755 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q"] Jan 31 14:54:18 crc kubenswrapper[4751]: W0131 14:54:18.790322 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod667a6cec_bf73_4340_9be6_f4bc10182004.slice/crio-d7229488283ac7b8dda0826471cb965309cf1c47e2f8ba0d50557f1751121c0d WatchSource:0}: Error finding container d7229488283ac7b8dda0826471cb965309cf1c47e2f8ba0d50557f1751121c0d: Status 404 returned error can't find the container with id d7229488283ac7b8dda0826471cb965309cf1c47e2f8ba0d50557f1751121c0d Jan 31 14:54:19 crc kubenswrapper[4751]: I0131 14:54:19.070550 4751 generic.go:334] "Generic (PLEG): container finished" podID="667a6cec-bf73-4340-9be6-f4bc10182004" containerID="8c99859db003b8960447da601e95711f7b0d1554d7ee22f9d6cb9490f3263093" exitCode=0 Jan 31 
14:54:19 crc kubenswrapper[4751]: I0131 14:54:19.070604 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" event={"ID":"667a6cec-bf73-4340-9be6-f4bc10182004","Type":"ContainerDied","Data":"8c99859db003b8960447da601e95711f7b0d1554d7ee22f9d6cb9490f3263093"} Jan 31 14:54:19 crc kubenswrapper[4751]: I0131 14:54:19.070640 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" event={"ID":"667a6cec-bf73-4340-9be6-f4bc10182004","Type":"ContainerStarted","Data":"d7229488283ac7b8dda0826471cb965309cf1c47e2f8ba0d50557f1751121c0d"} Jan 31 14:54:20 crc kubenswrapper[4751]: I0131 14:54:20.078581 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" event={"ID":"667a6cec-bf73-4340-9be6-f4bc10182004","Type":"ContainerStarted","Data":"48b5fe15e6b5d08f52dd98326462e391e5fafbd3bd396d34d3c7b444efffa146"} Jan 31 14:54:21 crc kubenswrapper[4751]: I0131 14:54:21.084715 4751 generic.go:334] "Generic (PLEG): container finished" podID="667a6cec-bf73-4340-9be6-f4bc10182004" containerID="48b5fe15e6b5d08f52dd98326462e391e5fafbd3bd396d34d3c7b444efffa146" exitCode=0 Jan 31 14:54:21 crc kubenswrapper[4751]: I0131 14:54:21.084765 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" event={"ID":"667a6cec-bf73-4340-9be6-f4bc10182004","Type":"ContainerDied","Data":"48b5fe15e6b5d08f52dd98326462e391e5fafbd3bd396d34d3c7b444efffa146"} Jan 31 14:54:22 crc kubenswrapper[4751]: I0131 14:54:22.094883 4751 generic.go:334] "Generic (PLEG): container finished" podID="667a6cec-bf73-4340-9be6-f4bc10182004" containerID="1220529350d17a7bb750446818ec08ebb9bc079afd6ac80866f7fe1abd4f1db3" exitCode=0 Jan 31 14:54:22 crc kubenswrapper[4751]: I0131 
14:54:22.094939 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" event={"ID":"667a6cec-bf73-4340-9be6-f4bc10182004","Type":"ContainerDied","Data":"1220529350d17a7bb750446818ec08ebb9bc079afd6ac80866f7fe1abd4f1db3"} Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.366591 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.463476 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-bundle\") pod \"667a6cec-bf73-4340-9be6-f4bc10182004\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.463602 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-util\") pod \"667a6cec-bf73-4340-9be6-f4bc10182004\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.463672 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwwbj\" (UniqueName: \"kubernetes.io/projected/667a6cec-bf73-4340-9be6-f4bc10182004-kube-api-access-bwwbj\") pod \"667a6cec-bf73-4340-9be6-f4bc10182004\" (UID: \"667a6cec-bf73-4340-9be6-f4bc10182004\") " Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.464815 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-bundle" (OuterVolumeSpecName: "bundle") pod "667a6cec-bf73-4340-9be6-f4bc10182004" (UID: "667a6cec-bf73-4340-9be6-f4bc10182004"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.469703 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667a6cec-bf73-4340-9be6-f4bc10182004-kube-api-access-bwwbj" (OuterVolumeSpecName: "kube-api-access-bwwbj") pod "667a6cec-bf73-4340-9be6-f4bc10182004" (UID: "667a6cec-bf73-4340-9be6-f4bc10182004"). InnerVolumeSpecName "kube-api-access-bwwbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.494301 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-util" (OuterVolumeSpecName: "util") pod "667a6cec-bf73-4340-9be6-f4bc10182004" (UID: "667a6cec-bf73-4340-9be6-f4bc10182004"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.565625 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.565675 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/667a6cec-bf73-4340-9be6-f4bc10182004-util\") on node \"crc\" DevicePath \"\"" Jan 31 14:54:23 crc kubenswrapper[4751]: I0131 14:54:23.565695 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwwbj\" (UniqueName: \"kubernetes.io/projected/667a6cec-bf73-4340-9be6-f4bc10182004-kube-api-access-bwwbj\") on node \"crc\" DevicePath \"\"" Jan 31 14:54:24 crc kubenswrapper[4751]: I0131 14:54:24.112369 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" 
event={"ID":"667a6cec-bf73-4340-9be6-f4bc10182004","Type":"ContainerDied","Data":"d7229488283ac7b8dda0826471cb965309cf1c47e2f8ba0d50557f1751121c0d"} Jan 31 14:54:24 crc kubenswrapper[4751]: I0131 14:54:24.112442 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7229488283ac7b8dda0826471cb965309cf1c47e2f8ba0d50557f1751121c0d" Jan 31 14:54:24 crc kubenswrapper[4751]: I0131 14:54:24.112936 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.433163 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"] Jan 31 14:54:31 crc kubenswrapper[4751]: E0131 14:54:31.433920 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="pull" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.433936 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="pull" Jan 31 14:54:31 crc kubenswrapper[4751]: E0131 14:54:31.433957 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="util" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.433965 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="util" Jan 31 14:54:31 crc kubenswrapper[4751]: E0131 14:54:31.433974 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="extract" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.433983 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="extract" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.434134 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" containerName="extract" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.434565 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.436863 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vt8x4" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.436984 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.437507 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.446609 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"] Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.580495 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-apiservice-cert\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.580571 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zcjm\" (UniqueName: \"kubernetes.io/projected/14df28b7-d7cb-466e-aa07-69e320d71620-kube-api-access-5zcjm\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " 
pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.580609 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-webhook-cert\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.681980 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zcjm\" (UniqueName: \"kubernetes.io/projected/14df28b7-d7cb-466e-aa07-69e320d71620-kube-api-access-5zcjm\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.682059 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-webhook-cert\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.682272 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-apiservice-cert\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.692117 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-webhook-cert\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.693360 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-apiservice-cert\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.727231 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zcjm\" (UniqueName: \"kubernetes.io/projected/14df28b7-d7cb-466e-aa07-69e320d71620-kube-api-access-5zcjm\") pod \"mariadb-operator-controller-manager-65848b4486-qb6hn\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") " pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:31 crc kubenswrapper[4751]: I0131 14:54:31.752548 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:32 crc kubenswrapper[4751]: I0131 14:54:32.066498 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"] Jan 31 14:54:32 crc kubenswrapper[4751]: I0131 14:54:32.168428 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" event={"ID":"14df28b7-d7cb-466e-aa07-69e320d71620","Type":"ContainerStarted","Data":"8d7ddc4e6b1f882339c27c9bee06d6abc3c29498935b356f92bf581f66149e68"} Jan 31 14:54:36 crc kubenswrapper[4751]: I0131 14:54:36.205253 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" event={"ID":"14df28b7-d7cb-466e-aa07-69e320d71620","Type":"ContainerStarted","Data":"335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41"} Jan 31 14:54:36 crc kubenswrapper[4751]: I0131 14:54:36.205870 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:36 crc kubenswrapper[4751]: I0131 14:54:36.235749 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" podStartSLOduration=1.911500285 podStartE2EDuration="5.235715992s" podCreationTimestamp="2026-01-31 14:54:31 +0000 UTC" firstStartedPulling="2026-01-31 14:54:32.07448376 +0000 UTC m=+776.449196645" lastFinishedPulling="2026-01-31 14:54:35.398699467 +0000 UTC m=+779.773412352" observedRunningTime="2026-01-31 14:54:36.230192687 +0000 UTC m=+780.604905612" watchObservedRunningTime="2026-01-31 14:54:36.235715992 +0000 UTC m=+780.610428917" Jan 31 14:54:38 crc kubenswrapper[4751]: I0131 14:54:38.897051 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:54:38 crc kubenswrapper[4751]: I0131 14:54:38.897609 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:54:41 crc kubenswrapper[4751]: I0131 14:54:41.758146 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.441876 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-9b26d"] Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.453223 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.484654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26444\" (UniqueName: \"kubernetes.io/projected/f24c1af7-b130-4ede-a7be-24aedb5c293b-kube-api-access-26444\") pod \"infra-operator-index-9b26d\" (UID: \"f24c1af7-b130-4ede-a7be-24aedb5c293b\") " pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.524443 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-2wdmz" Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.528434 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-9b26d"] Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.585517 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26444\" (UniqueName: \"kubernetes.io/projected/f24c1af7-b130-4ede-a7be-24aedb5c293b-kube-api-access-26444\") pod \"infra-operator-index-9b26d\" (UID: \"f24c1af7-b130-4ede-a7be-24aedb5c293b\") " pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.769045 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26444\" (UniqueName: \"kubernetes.io/projected/f24c1af7-b130-4ede-a7be-24aedb5c293b-kube-api-access-26444\") pod \"infra-operator-index-9b26d\" (UID: \"f24c1af7-b130-4ede-a7be-24aedb5c293b\") " pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:46 crc kubenswrapper[4751]: I0131 14:54:46.781499 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:47 crc kubenswrapper[4751]: I0131 14:54:47.265395 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-9b26d"] Jan 31 14:54:48 crc kubenswrapper[4751]: I0131 14:54:48.279911 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-9b26d" event={"ID":"f24c1af7-b130-4ede-a7be-24aedb5c293b","Type":"ContainerStarted","Data":"8a29ac7a55371e1e5a88fa32404a31e8d6cddca143fb918552f1ab9d6575739d"} Jan 31 14:54:49 crc kubenswrapper[4751]: I0131 14:54:49.286979 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-9b26d" event={"ID":"f24c1af7-b130-4ede-a7be-24aedb5c293b","Type":"ContainerStarted","Data":"0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7"} Jan 31 14:54:49 crc kubenswrapper[4751]: I0131 14:54:49.300129 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-9b26d" podStartSLOduration=1.9576949209999999 podStartE2EDuration="3.30004309s" podCreationTimestamp="2026-01-31 14:54:46 +0000 UTC" firstStartedPulling="2026-01-31 14:54:47.283165727 +0000 UTC m=+791.657878612" lastFinishedPulling="2026-01-31 14:54:48.625513896 +0000 UTC m=+793.000226781" observedRunningTime="2026-01-31 14:54:49.298650443 +0000 UTC m=+793.673363328" watchObservedRunningTime="2026-01-31 14:54:49.30004309 +0000 UTC m=+793.674755985" Jan 31 14:54:49 crc kubenswrapper[4751]: I0131 14:54:49.395639 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-9b26d"] Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.017497 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-5tz82"] Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.018942 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.025356 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5tz82"] Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.165225 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmd6m\" (UniqueName: \"kubernetes.io/projected/15539f33-874c-45ae-8ee2-7f821c54b267-kube-api-access-rmd6m\") pod \"infra-operator-index-5tz82\" (UID: \"15539f33-874c-45ae-8ee2-7f821c54b267\") " pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.266845 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmd6m\" (UniqueName: \"kubernetes.io/projected/15539f33-874c-45ae-8ee2-7f821c54b267-kube-api-access-rmd6m\") pod \"infra-operator-index-5tz82\" (UID: \"15539f33-874c-45ae-8ee2-7f821c54b267\") " pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.303113 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmd6m\" (UniqueName: \"kubernetes.io/projected/15539f33-874c-45ae-8ee2-7f821c54b267-kube-api-access-rmd6m\") pod \"infra-operator-index-5tz82\" (UID: \"15539f33-874c-45ae-8ee2-7f821c54b267\") " pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.391552 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:54:50 crc kubenswrapper[4751]: I0131 14:54:50.828582 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-5tz82"] Jan 31 14:54:50 crc kubenswrapper[4751]: W0131 14:54:50.839337 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15539f33_874c_45ae_8ee2_7f821c54b267.slice/crio-a32321b4d51d551ed7ea834004f3d66d0ba16e8c6d1b16cfe9fefade795fabc7 WatchSource:0}: Error finding container a32321b4d51d551ed7ea834004f3d66d0ba16e8c6d1b16cfe9fefade795fabc7: Status 404 returned error can't find the container with id a32321b4d51d551ed7ea834004f3d66d0ba16e8c6d1b16cfe9fefade795fabc7 Jan 31 14:54:51 crc kubenswrapper[4751]: I0131 14:54:51.308398 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5tz82" event={"ID":"15539f33-874c-45ae-8ee2-7f821c54b267","Type":"ContainerStarted","Data":"a32321b4d51d551ed7ea834004f3d66d0ba16e8c6d1b16cfe9fefade795fabc7"} Jan 31 14:54:51 crc kubenswrapper[4751]: I0131 14:54:51.308966 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-9b26d" podUID="f24c1af7-b130-4ede-a7be-24aedb5c293b" containerName="registry-server" containerID="cri-o://0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7" gracePeriod=2 Jan 31 14:54:51 crc kubenswrapper[4751]: I0131 14:54:51.700518 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:51 crc kubenswrapper[4751]: I0131 14:54:51.786543 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26444\" (UniqueName: \"kubernetes.io/projected/f24c1af7-b130-4ede-a7be-24aedb5c293b-kube-api-access-26444\") pod \"f24c1af7-b130-4ede-a7be-24aedb5c293b\" (UID: \"f24c1af7-b130-4ede-a7be-24aedb5c293b\") " Jan 31 14:54:51 crc kubenswrapper[4751]: I0131 14:54:51.793917 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24c1af7-b130-4ede-a7be-24aedb5c293b-kube-api-access-26444" (OuterVolumeSpecName: "kube-api-access-26444") pod "f24c1af7-b130-4ede-a7be-24aedb5c293b" (UID: "f24c1af7-b130-4ede-a7be-24aedb5c293b"). InnerVolumeSpecName "kube-api-access-26444". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:54:51 crc kubenswrapper[4751]: I0131 14:54:51.888550 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26444\" (UniqueName: \"kubernetes.io/projected/f24c1af7-b130-4ede-a7be-24aedb5c293b-kube-api-access-26444\") on node \"crc\" DevicePath \"\"" Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.316244 4751 generic.go:334] "Generic (PLEG): container finished" podID="f24c1af7-b130-4ede-a7be-24aedb5c293b" containerID="0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7" exitCode=0 Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.316314 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-9b26d" event={"ID":"f24c1af7-b130-4ede-a7be-24aedb5c293b","Type":"ContainerDied","Data":"0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7"} Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.316341 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-9b26d" 
event={"ID":"f24c1af7-b130-4ede-a7be-24aedb5c293b","Type":"ContainerDied","Data":"8a29ac7a55371e1e5a88fa32404a31e8d6cddca143fb918552f1ab9d6575739d"} Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.316334 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-9b26d" Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.316391 4751 scope.go:117] "RemoveContainer" containerID="0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7" Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.318351 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5tz82" event={"ID":"15539f33-874c-45ae-8ee2-7f821c54b267","Type":"ContainerStarted","Data":"2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a"} Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.342240 4751 scope.go:117] "RemoveContainer" containerID="0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7" Jan 31 14:54:52 crc kubenswrapper[4751]: E0131 14:54:52.342830 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7\": container with ID starting with 0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7 not found: ID does not exist" containerID="0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7" Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.342869 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7"} err="failed to get container status \"0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7\": rpc error: code = NotFound desc = could not find container \"0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7\": container with ID starting with 
0932a4c503ee3b0e04317220a8d546932e9688b754dc52c70da600000553bab7 not found: ID does not exist" Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.356493 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-5tz82" podStartSLOduration=2.92474711 podStartE2EDuration="3.356469785s" podCreationTimestamp="2026-01-31 14:54:49 +0000 UTC" firstStartedPulling="2026-01-31 14:54:50.843994633 +0000 UTC m=+795.218707548" lastFinishedPulling="2026-01-31 14:54:51.275717328 +0000 UTC m=+795.650430223" observedRunningTime="2026-01-31 14:54:52.342293721 +0000 UTC m=+796.717006626" watchObservedRunningTime="2026-01-31 14:54:52.356469785 +0000 UTC m=+796.731182680" Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.360319 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-9b26d"] Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.365230 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-9b26d"] Jan 31 14:54:52 crc kubenswrapper[4751]: I0131 14:54:52.413326 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24c1af7-b130-4ede-a7be-24aedb5c293b" path="/var/lib/kubelet/pods/f24c1af7-b130-4ede-a7be-24aedb5c293b/volumes" Jan 31 14:55:00 crc kubenswrapper[4751]: I0131 14:55:00.392225 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:55:00 crc kubenswrapper[4751]: I0131 14:55:00.392994 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:55:00 crc kubenswrapper[4751]: I0131 14:55:00.437545 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:55:01 crc kubenswrapper[4751]: I0131 14:55:01.401790 4751 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-5tz82" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.647456 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6"] Jan 31 14:55:02 crc kubenswrapper[4751]: E0131 14:55:02.648294 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24c1af7-b130-4ede-a7be-24aedb5c293b" containerName="registry-server" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.648326 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24c1af7-b130-4ede-a7be-24aedb5c293b" containerName="registry-server" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.648612 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24c1af7-b130-4ede-a7be-24aedb5c293b" containerName="registry-server" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.650432 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.656680 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6"] Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.657939 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wxkjx" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.755763 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l97j\" (UniqueName: \"kubernetes.io/projected/29a3b16f-f39d-413a-b623-3ac15aba50cf-kube-api-access-8l97j\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.755841 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.755869 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 
14:55:02.857184 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.857223 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.857278 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l97j\" (UniqueName: \"kubernetes.io/projected/29a3b16f-f39d-413a-b623-3ac15aba50cf-kube-api-access-8l97j\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.857818 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.857886 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.876692 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l97j\" (UniqueName: \"kubernetes.io/projected/29a3b16f-f39d-413a-b623-3ac15aba50cf-kube-api-access-8l97j\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:02 crc kubenswrapper[4751]: I0131 14:55:02.967337 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:03 crc kubenswrapper[4751]: I0131 14:55:03.295637 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6"] Jan 31 14:55:03 crc kubenswrapper[4751]: I0131 14:55:03.392161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" event={"ID":"29a3b16f-f39d-413a-b623-3ac15aba50cf","Type":"ContainerStarted","Data":"02ab8ec2e4c1f038366997dc0d3a6ec842779cf69755b318b2167207ddaf7560"} Jan 31 14:55:04 crc kubenswrapper[4751]: I0131 14:55:04.402498 4751 generic.go:334] "Generic (PLEG): container finished" podID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerID="914fa7bc157f85f90159778e4a352984883804f817b8f2353eb69568b5c31c21" exitCode=0 Jan 31 14:55:04 crc kubenswrapper[4751]: I0131 14:55:04.402569 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" event={"ID":"29a3b16f-f39d-413a-b623-3ac15aba50cf","Type":"ContainerDied","Data":"914fa7bc157f85f90159778e4a352984883804f817b8f2353eb69568b5c31c21"} Jan 31 14:55:05 crc kubenswrapper[4751]: I0131 14:55:05.409710 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" event={"ID":"29a3b16f-f39d-413a-b623-3ac15aba50cf","Type":"ContainerStarted","Data":"c816a8193fdacfa313315863400dd00d03b42ba5e5ce2524c35985ffd3fa845b"} Jan 31 14:55:06 crc kubenswrapper[4751]: I0131 14:55:06.420949 4751 generic.go:334] "Generic (PLEG): container finished" podID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerID="c816a8193fdacfa313315863400dd00d03b42ba5e5ce2524c35985ffd3fa845b" exitCode=0 Jan 31 14:55:06 crc kubenswrapper[4751]: I0131 14:55:06.421521 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" event={"ID":"29a3b16f-f39d-413a-b623-3ac15aba50cf","Type":"ContainerDied","Data":"c816a8193fdacfa313315863400dd00d03b42ba5e5ce2524c35985ffd3fa845b"} Jan 31 14:55:07 crc kubenswrapper[4751]: I0131 14:55:07.429369 4751 generic.go:334] "Generic (PLEG): container finished" podID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerID="ed9ea3bb8f54f1c0a1685efd692fcb4334fbd2ea55432c305b974a3bf1ca584b" exitCode=0 Jan 31 14:55:07 crc kubenswrapper[4751]: I0131 14:55:07.429442 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" event={"ID":"29a3b16f-f39d-413a-b623-3ac15aba50cf","Type":"ContainerDied","Data":"ed9ea3bb8f54f1c0a1685efd692fcb4334fbd2ea55432c305b974a3bf1ca584b"} Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.842487 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.896677 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.896765 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.949731 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l97j\" (UniqueName: \"kubernetes.io/projected/29a3b16f-f39d-413a-b623-3ac15aba50cf-kube-api-access-8l97j\") pod \"29a3b16f-f39d-413a-b623-3ac15aba50cf\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.949792 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-bundle\") pod \"29a3b16f-f39d-413a-b623-3ac15aba50cf\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.949825 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-util\") pod \"29a3b16f-f39d-413a-b623-3ac15aba50cf\" (UID: \"29a3b16f-f39d-413a-b623-3ac15aba50cf\") " Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.954763 4751 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-bundle" (OuterVolumeSpecName: "bundle") pod "29a3b16f-f39d-413a-b623-3ac15aba50cf" (UID: "29a3b16f-f39d-413a-b623-3ac15aba50cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:55:08 crc kubenswrapper[4751]: I0131 14:55:08.957157 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a3b16f-f39d-413a-b623-3ac15aba50cf-kube-api-access-8l97j" (OuterVolumeSpecName: "kube-api-access-8l97j") pod "29a3b16f-f39d-413a-b623-3ac15aba50cf" (UID: "29a3b16f-f39d-413a-b623-3ac15aba50cf"). InnerVolumeSpecName "kube-api-access-8l97j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.026671 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-util" (OuterVolumeSpecName: "util") pod "29a3b16f-f39d-413a-b623-3ac15aba50cf" (UID: "29a3b16f-f39d-413a-b623-3ac15aba50cf"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.051844 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l97j\" (UniqueName: \"kubernetes.io/projected/29a3b16f-f39d-413a-b623-3ac15aba50cf-kube-api-access-8l97j\") on node \"crc\" DevicePath \"\"" Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.051898 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.051918 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29a3b16f-f39d-413a-b623-3ac15aba50cf-util\") on node \"crc\" DevicePath \"\"" Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.443154 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" event={"ID":"29a3b16f-f39d-413a-b623-3ac15aba50cf","Type":"ContainerDied","Data":"02ab8ec2e4c1f038366997dc0d3a6ec842779cf69755b318b2167207ddaf7560"} Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.443221 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ab8ec2e4c1f038366997dc0d3a6ec842779cf69755b318b2167207ddaf7560" Jan 31 14:55:09 crc kubenswrapper[4751]: I0131 14:55:09.443325 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.530404 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"] Jan 31 14:55:19 crc kubenswrapper[4751]: E0131 14:55:19.531116 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="extract" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.531132 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="extract" Jan 31 14:55:19 crc kubenswrapper[4751]: E0131 14:55:19.531150 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="pull" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.531158 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="pull" Jan 31 14:55:19 crc kubenswrapper[4751]: E0131 14:55:19.531171 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="util" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.531180 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="util" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.531310 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" containerName="extract" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.531793 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.535251 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.535403 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t98gl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.555461 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"] Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.594846 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hq5r\" (UniqueName: \"kubernetes.io/projected/6578d137-d120-43b2-99e3-71d4f6525d6c-kube-api-access-7hq5r\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.594928 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-webhook-cert\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.594965 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-apiservice-cert\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: 
\"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.696008 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hq5r\" (UniqueName: \"kubernetes.io/projected/6578d137-d120-43b2-99e3-71d4f6525d6c-kube-api-access-7hq5r\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.696102 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-webhook-cert\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.696137 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-apiservice-cert\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.702401 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-webhook-cert\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.706659 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-apiservice-cert\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.711597 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hq5r\" (UniqueName: \"kubernetes.io/projected/6578d137-d120-43b2-99e3-71d4f6525d6c-kube-api-access-7hq5r\") pod \"infra-operator-controller-manager-57f67fdff5-45pkl\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") " pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:19 crc kubenswrapper[4751]: I0131 14:55:19.858148 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:20 crc kubenswrapper[4751]: I0131 14:55:20.249247 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"] Jan 31 14:55:20 crc kubenswrapper[4751]: W0131 14:55:20.260547 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6578d137_d120_43b2_99e3_71d4f6525d6c.slice/crio-fd210a97bb4f47dccbcdbfba3a6c2101ade7c45f9468d34991d8307e718c3b16 WatchSource:0}: Error finding container fd210a97bb4f47dccbcdbfba3a6c2101ade7c45f9468d34991d8307e718c3b16: Status 404 returned error can't find the container with id fd210a97bb4f47dccbcdbfba3a6c2101ade7c45f9468d34991d8307e718c3b16 Jan 31 14:55:20 crc kubenswrapper[4751]: I0131 14:55:20.517739 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" 
event={"ID":"6578d137-d120-43b2-99e3-71d4f6525d6c","Type":"ContainerStarted","Data":"fd210a97bb4f47dccbcdbfba3a6c2101ade7c45f9468d34991d8307e718c3b16"} Jan 31 14:55:22 crc kubenswrapper[4751]: I0131 14:55:22.531493 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" event={"ID":"6578d137-d120-43b2-99e3-71d4f6525d6c","Type":"ContainerStarted","Data":"5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8"} Jan 31 14:55:22 crc kubenswrapper[4751]: I0131 14:55:22.532315 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:22 crc kubenswrapper[4751]: I0131 14:55:22.554099 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" podStartSLOduration=1.70522749 podStartE2EDuration="3.554082306s" podCreationTimestamp="2026-01-31 14:55:19 +0000 UTC" firstStartedPulling="2026-01-31 14:55:20.262831913 +0000 UTC m=+824.637544798" lastFinishedPulling="2026-01-31 14:55:22.111686729 +0000 UTC m=+826.486399614" observedRunningTime="2026-01-31 14:55:22.549117295 +0000 UTC m=+826.923830220" watchObservedRunningTime="2026-01-31 14:55:22.554082306 +0000 UTC m=+826.928795191" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.242888 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.244997 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.247902 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-bgvdx" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.248886 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.248946 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.248884 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.249496 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.251654 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.252710 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.262339 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.317249 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-generated\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.317331 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng8cd\" (UniqueName: \"kubernetes.io/projected/07a2906d-db30-4578-8b1e-088ca2f20ced-kube-api-access-ng8cd\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.317408 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.317489 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-operator-scripts\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.317609 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-kolla-config\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.317663 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-default\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.319813 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.325517 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.357472 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.366390 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418384 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-kolla-config\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418430 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418466 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418483 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-kolla-config\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418498 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-default\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418515 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-operator-scripts\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418541 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418559 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng8cd\" (UniqueName: \"kubernetes.io/projected/07a2906d-db30-4578-8b1e-088ca2f20ced-kube-api-access-ng8cd\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418634 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nj4k\" (UniqueName: \"kubernetes.io/projected/22459bcc-672e-4390-89ae-2b5fa48ded71-kube-api-access-5nj4k\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418748 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-generated\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.418844 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-operator-scripts\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " 
pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.419037 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.419445 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-default\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.419599 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-kolla-config\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.419955 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-generated\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.420302 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-operator-scripts\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 
14:55:27.440039 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng8cd\" (UniqueName: \"kubernetes.io/projected/07a2906d-db30-4578-8b1e-088ca2f20ced-kube-api-access-ng8cd\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.455765 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520577 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwsz6\" (UniqueName: \"kubernetes.io/projected/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kube-api-access-rwsz6\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520655 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520692 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-kolla-config\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520724 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-default\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520752 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-operator-scripts\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520783 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kolla-config\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-default\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520851 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520942 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nj4k\" (UniqueName: \"kubernetes.io/projected/22459bcc-672e-4390-89ae-2b5fa48ded71-kube-api-access-5nj4k\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.520972 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-generated\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.521049 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.521787 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.522654 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-kolla-config\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.523251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-default\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.526194 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-operator-scripts\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.526528 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-generated\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.537336 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nj4k\" (UniqueName: \"kubernetes.io/projected/22459bcc-672e-4390-89ae-2b5fa48ded71-kube-api-access-5nj4k\") pod \"openstack-galera-1\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.545300 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-1\" (UID: 
\"22459bcc-672e-4390-89ae-2b5fa48ded71\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622015 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622489 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kolla-config\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622558 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-default\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622587 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622614 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622669 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.622750 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwsz6\" (UniqueName: \"kubernetes.io/projected/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kube-api-access-rwsz6\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.623343 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.623568 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-default\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.625800 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.630911 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.631425 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.632232 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kolla-config\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.645388 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwsz6\" (UniqueName: \"kubernetes.io/projected/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kube-api-access-rwsz6\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.653828 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-2\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.669629 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.866602 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 14:55:27 crc kubenswrapper[4751]: I0131 14:55:27.909404 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 14:55:27 crc kubenswrapper[4751]: W0131 14:55:27.915727 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22459bcc_672e_4390_89ae_2b5fa48ded71.slice/crio-6b6faf7aa73840af2027f08065efac105f4b0ad43c2d2c60890bf024de99e2ca WatchSource:0}: Error finding container 6b6faf7aa73840af2027f08065efac105f4b0ad43c2d2c60890bf024de99e2ca: Status 404 returned error can't find the container with id 6b6faf7aa73840af2027f08065efac105f4b0ad43c2d2c60890bf024de99e2ca Jan 31 14:55:28 crc kubenswrapper[4751]: I0131 14:55:28.186365 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 14:55:28 crc kubenswrapper[4751]: W0131 14:55:28.186681 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fcd9bac_c0cb_4de4_b630_0db07f110da7.slice/crio-4483e874a8f4e15e4dfcdca687206a7af35257a8c5ba1cb56d33195e769924f9 WatchSource:0}: Error finding container 4483e874a8f4e15e4dfcdca687206a7af35257a8c5ba1cb56d33195e769924f9: Status 404 returned error can't find the container with id 4483e874a8f4e15e4dfcdca687206a7af35257a8c5ba1cb56d33195e769924f9 Jan 31 14:55:28 crc kubenswrapper[4751]: I0131 14:55:28.567332 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"07a2906d-db30-4578-8b1e-088ca2f20ced","Type":"ContainerStarted","Data":"2161c6d33cfda8a5b256a8346412b18ad489372437142a6a6602a50128a7c01a"} Jan 31 14:55:28 crc kubenswrapper[4751]: 
I0131 14:55:28.569168 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"22459bcc-672e-4390-89ae-2b5fa48ded71","Type":"ContainerStarted","Data":"6b6faf7aa73840af2027f08065efac105f4b0ad43c2d2c60890bf024de99e2ca"} Jan 31 14:55:28 crc kubenswrapper[4751]: I0131 14:55:28.570447 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"3fcd9bac-c0cb-4de4-b630-0db07f110da7","Type":"ContainerStarted","Data":"4483e874a8f4e15e4dfcdca687206a7af35257a8c5ba1cb56d33195e769924f9"} Jan 31 14:55:29 crc kubenswrapper[4751]: I0131 14:55:29.865966 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.590980 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.591851 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.595352 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.595758 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-c4g7x" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.637250 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.681257 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-config-data\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.681308 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kolla-config\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.681340 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtbr\" (UniqueName: \"kubernetes.io/projected/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kube-api-access-gbtbr\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.782769 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-config-data\") pod 
\"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.782850 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kolla-config\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.782905 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtbr\" (UniqueName: \"kubernetes.io/projected/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kube-api-access-gbtbr\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.783692 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kolla-config\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.783718 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-config-data\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.814328 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtbr\" (UniqueName: \"kubernetes.io/projected/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kube-api-access-gbtbr\") pod \"memcached-0\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:31 crc kubenswrapper[4751]: I0131 14:55:31.910896 4751 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.525348 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-2wvgm"] Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.526351 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.535689 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-sm9tx" Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.545871 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-2wvgm"] Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.700472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzzlf\" (UniqueName: \"kubernetes.io/projected/44c515c1-f30f-44da-8959-cfd2530b46b7-kube-api-access-rzzlf\") pod \"rabbitmq-cluster-operator-index-2wvgm\" (UID: \"44c515c1-f30f-44da-8959-cfd2530b46b7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.802062 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzlf\" (UniqueName: \"kubernetes.io/projected/44c515c1-f30f-44da-8959-cfd2530b46b7-kube-api-access-rzzlf\") pod \"rabbitmq-cluster-operator-index-2wvgm\" (UID: \"44c515c1-f30f-44da-8959-cfd2530b46b7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.821416 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzzlf\" (UniqueName: 
\"kubernetes.io/projected/44c515c1-f30f-44da-8959-cfd2530b46b7-kube-api-access-rzzlf\") pod \"rabbitmq-cluster-operator-index-2wvgm\" (UID: \"44c515c1-f30f-44da-8959-cfd2530b46b7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:32 crc kubenswrapper[4751]: I0131 14:55:32.854292 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:36 crc kubenswrapper[4751]: I0131 14:55:36.213327 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 14:55:36 crc kubenswrapper[4751]: I0131 14:55:36.237736 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-2wvgm"] Jan 31 14:55:36 crc kubenswrapper[4751]: W0131 14:55:36.247294 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c515c1_f30f_44da_8959_cfd2530b46b7.slice/crio-b07bdb6979a897c42db641896c014136c4b8817fab040635c752ccba6b137d19 WatchSource:0}: Error finding container b07bdb6979a897c42db641896c014136c4b8817fab040635c752ccba6b137d19: Status 404 returned error can't find the container with id b07bdb6979a897c42db641896c014136c4b8817fab040635c752ccba6b137d19 Jan 31 14:55:36 crc kubenswrapper[4751]: I0131 14:55:36.618897 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" event={"ID":"44c515c1-f30f-44da-8959-cfd2530b46b7","Type":"ContainerStarted","Data":"b07bdb6979a897c42db641896c014136c4b8817fab040635c752ccba6b137d19"} Jan 31 14:55:36 crc kubenswrapper[4751]: I0131 14:55:36.619813 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c","Type":"ContainerStarted","Data":"cf904354b92714c266cf175421ba71e5ed9cb49d7ba4bbc0c72df9a09635ce8a"} Jan 31 14:55:38 crc 
kubenswrapper[4751]: I0131 14:55:38.632709 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"07a2906d-db30-4578-8b1e-088ca2f20ced","Type":"ContainerStarted","Data":"1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96"} Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.641220 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"22459bcc-672e-4390-89ae-2b5fa48ded71","Type":"ContainerStarted","Data":"0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987"} Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.644189 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"3fcd9bac-c0cb-4de4-b630-0db07f110da7","Type":"ContainerStarted","Data":"b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d"} Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.896811 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.896875 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.896926 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.897493 4751 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4d4f92719c72ec0adb31e02a10d5c8bcb4b1a9b3bfb5b0e7ed8cfdbc4a53235"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 14:55:38 crc kubenswrapper[4751]: I0131 14:55:38.897563 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://f4d4f92719c72ec0adb31e02a10d5c8bcb4b1a9b3bfb5b0e7ed8cfdbc4a53235" gracePeriod=600 Jan 31 14:55:39 crc kubenswrapper[4751]: I0131 14:55:39.652275 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="f4d4f92719c72ec0adb31e02a10d5c8bcb4b1a9b3bfb5b0e7ed8cfdbc4a53235" exitCode=0 Jan 31 14:55:39 crc kubenswrapper[4751]: I0131 14:55:39.652404 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"f4d4f92719c72ec0adb31e02a10d5c8bcb4b1a9b3bfb5b0e7ed8cfdbc4a53235"} Jan 31 14:55:39 crc kubenswrapper[4751]: I0131 14:55:39.652593 4751 scope.go:117] "RemoveContainer" containerID="ef29f0f695de11b302d97f5ade678c0ae9fdc43953c2430b685d7fd276ee3217" Jan 31 14:55:40 crc kubenswrapper[4751]: I0131 14:55:40.669664 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"dc064826cd8a78005216541d25736856cc2dd920bfe44778b79dbfd2f76ed341"} Jan 31 14:55:41 crc kubenswrapper[4751]: I0131 14:55:41.675807 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" 
event={"ID":"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c","Type":"ContainerStarted","Data":"7d9c0759f36bb098c88e33085270280041e2db4b3aa27d3f10dea45195deff2f"} Jan 31 14:55:41 crc kubenswrapper[4751]: I0131 14:55:41.676343 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:41 crc kubenswrapper[4751]: I0131 14:55:41.678408 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" event={"ID":"44c515c1-f30f-44da-8959-cfd2530b46b7","Type":"ContainerStarted","Data":"584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15"} Jan 31 14:55:41 crc kubenswrapper[4751]: I0131 14:55:41.695116 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=8.505498546 podStartE2EDuration="10.695098411s" podCreationTimestamp="2026-01-31 14:55:31 +0000 UTC" firstStartedPulling="2026-01-31 14:55:37.938743063 +0000 UTC m=+842.313455948" lastFinishedPulling="2026-01-31 14:55:40.128342928 +0000 UTC m=+844.503055813" observedRunningTime="2026-01-31 14:55:41.692266507 +0000 UTC m=+846.066979392" watchObservedRunningTime="2026-01-31 14:55:41.695098411 +0000 UTC m=+846.069811306" Jan 31 14:55:41 crc kubenswrapper[4751]: I0131 14:55:41.709604 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" podStartSLOduration=6.133324191 podStartE2EDuration="9.709585423s" podCreationTimestamp="2026-01-31 14:55:32 +0000 UTC" firstStartedPulling="2026-01-31 14:55:37.93366273 +0000 UTC m=+842.308375615" lastFinishedPulling="2026-01-31 14:55:41.509923962 +0000 UTC m=+845.884636847" observedRunningTime="2026-01-31 14:55:41.705718631 +0000 UTC m=+846.080431516" watchObservedRunningTime="2026-01-31 14:55:41.709585423 +0000 UTC m=+846.084298308" Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.688352 4751 generic.go:334] 
"Generic (PLEG): container finished" podID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerID="0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987" exitCode=0 Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.688424 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"22459bcc-672e-4390-89ae-2b5fa48ded71","Type":"ContainerDied","Data":"0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987"} Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.695107 4751 generic.go:334] "Generic (PLEG): container finished" podID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerID="b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d" exitCode=0 Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.695167 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"3fcd9bac-c0cb-4de4-b630-0db07f110da7","Type":"ContainerDied","Data":"b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d"} Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.699269 4751 generic.go:334] "Generic (PLEG): container finished" podID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerID="1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96" exitCode=0 Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.699604 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"07a2906d-db30-4578-8b1e-088ca2f20ced","Type":"ContainerDied","Data":"1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96"} Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.855126 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.855325 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:42 crc kubenswrapper[4751]: I0131 14:55:42.895960 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:43 crc kubenswrapper[4751]: I0131 14:55:43.711494 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"3fcd9bac-c0cb-4de4-b630-0db07f110da7","Type":"ContainerStarted","Data":"eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b"} Jan 31 14:55:43 crc kubenswrapper[4751]: I0131 14:55:43.716465 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"07a2906d-db30-4578-8b1e-088ca2f20ced","Type":"ContainerStarted","Data":"0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea"} Jan 31 14:55:43 crc kubenswrapper[4751]: I0131 14:55:43.721999 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"22459bcc-672e-4390-89ae-2b5fa48ded71","Type":"ContainerStarted","Data":"6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571"} Jan 31 14:55:43 crc kubenswrapper[4751]: I0131 14:55:43.747491 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=7.989230855 podStartE2EDuration="17.74746622s" podCreationTimestamp="2026-01-31 14:55:26 +0000 UTC" firstStartedPulling="2026-01-31 14:55:28.18967361 +0000 UTC m=+832.564386495" lastFinishedPulling="2026-01-31 14:55:37.947908975 +0000 UTC m=+842.322621860" observedRunningTime="2026-01-31 14:55:43.742353895 +0000 UTC m=+848.117066830" watchObservedRunningTime="2026-01-31 14:55:43.74746622 +0000 UTC m=+848.122179135" Jan 31 14:55:43 crc kubenswrapper[4751]: I0131 14:55:43.783292 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=7.754811398 podStartE2EDuration="17.783263903s" podCreationTimestamp="2026-01-31 14:55:26 +0000 UTC" firstStartedPulling="2026-01-31 14:55:27.919408509 +0000 UTC m=+832.294121394" lastFinishedPulling="2026-01-31 14:55:37.947861014 +0000 UTC m=+842.322573899" observedRunningTime="2026-01-31 14:55:43.777272055 +0000 UTC m=+848.151985010" watchObservedRunningTime="2026-01-31 14:55:43.783263903 +0000 UTC m=+848.157976818" Jan 31 14:55:43 crc kubenswrapper[4751]: I0131 14:55:43.811753 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=7.705973131 podStartE2EDuration="17.811727043s" podCreationTimestamp="2026-01-31 14:55:26 +0000 UTC" firstStartedPulling="2026-01-31 14:55:27.885479135 +0000 UTC m=+832.260192020" lastFinishedPulling="2026-01-31 14:55:37.991233047 +0000 UTC m=+842.365945932" observedRunningTime="2026-01-31 14:55:43.802956022 +0000 UTC m=+848.177668937" watchObservedRunningTime="2026-01-31 14:55:43.811727043 +0000 UTC m=+848.186439958" Jan 31 14:55:46 crc kubenswrapper[4751]: I0131 14:55:46.916366 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Jan 31 14:55:47 crc kubenswrapper[4751]: I0131 14:55:47.622859 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:47 crc kubenswrapper[4751]: I0131 14:55:47.622917 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:55:47 crc kubenswrapper[4751]: I0131 14:55:47.632136 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:55:47 crc kubenswrapper[4751]: I0131 14:55:47.632181 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" 
Jan 31 14:55:47 crc kubenswrapper[4751]: I0131 14:55:47.670403 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:47 crc kubenswrapper[4751]: I0131 14:55:47.670441 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:51 crc kubenswrapper[4751]: I0131 14:55:51.820857 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:51 crc kubenswrapper[4751]: I0131 14:55:51.906632 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 14:55:52 crc kubenswrapper[4751]: I0131 14:55:52.911223 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.305003 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-4gxnx"] Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.306195 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.309938 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.315720 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4gxnx"] Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.340365 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-operator-scripts\") pod \"root-account-create-update-4gxnx\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.340538 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqsm\" (UniqueName: \"kubernetes.io/projected/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-kube-api-access-2fqsm\") pod \"root-account-create-update-4gxnx\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.442624 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-operator-scripts\") pod \"root-account-create-update-4gxnx\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.442814 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqsm\" (UniqueName: \"kubernetes.io/projected/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-kube-api-access-2fqsm\") pod 
\"root-account-create-update-4gxnx\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.443838 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-operator-scripts\") pod \"root-account-create-update-4gxnx\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.466661 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqsm\" (UniqueName: \"kubernetes.io/projected/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-kube-api-access-2fqsm\") pod \"root-account-create-update-4gxnx\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.621657 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:55:56 crc kubenswrapper[4751]: I0131 14:55:56.889690 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4gxnx"] Jan 31 14:55:57 crc kubenswrapper[4751]: I0131 14:55:57.818336 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-4gxnx" event={"ID":"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf","Type":"ContainerStarted","Data":"b26b741fdf290763b6328eb1a8c5b1a7f048f2aecba802a031d85386bf813c0e"} Jan 31 14:55:57 crc kubenswrapper[4751]: I0131 14:55:57.818726 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-4gxnx" event={"ID":"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf","Type":"ContainerStarted","Data":"0a2861c5cc0f595bf81985741b866cc835b0dcdfb494adb385879ca0b4137437"} Jan 31 14:55:57 crc kubenswrapper[4751]: I0131 14:55:57.844024 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/root-account-create-update-4gxnx" podStartSLOduration=1.8439852060000002 podStartE2EDuration="1.843985206s" podCreationTimestamp="2026-01-31 14:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:55:57.840985247 +0000 UTC m=+862.215698232" watchObservedRunningTime="2026-01-31 14:55:57.843985206 +0000 UTC m=+862.218698101" Jan 31 14:55:58 crc kubenswrapper[4751]: I0131 14:55:58.825669 4751 generic.go:334] "Generic (PLEG): container finished" podID="6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" containerID="b26b741fdf290763b6328eb1a8c5b1a7f048f2aecba802a031d85386bf813c0e" exitCode=0 Jan 31 14:55:58 crc kubenswrapper[4751]: I0131 14:55:58.825804 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-4gxnx" 
event={"ID":"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf","Type":"ContainerDied","Data":"b26b741fdf290763b6328eb1a8c5b1a7f048f2aecba802a031d85386bf813c0e"} Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.328330 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9w6rf"] Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.330899 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.348175 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9w6rf"] Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.404956 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-utilities\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.405038 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2b4z\" (UniqueName: \"kubernetes.io/projected/db9636cf-e895-422b-8064-ce6d652a85d1-kube-api-access-w2b4z\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.405112 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-catalog-content\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.506816 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w2b4z\" (UniqueName: \"kubernetes.io/projected/db9636cf-e895-422b-8064-ce6d652a85d1-kube-api-access-w2b4z\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.506865 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-catalog-content\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.506947 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-utilities\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.507382 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-utilities\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.507439 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-catalog-content\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.527234 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2b4z\" 
(UniqueName: \"kubernetes.io/projected/db9636cf-e895-422b-8064-ce6d652a85d1-kube-api-access-w2b4z\") pod \"redhat-operators-9w6rf\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:00 crc kubenswrapper[4751]: I0131 14:56:00.671923 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.774527 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb"] Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.776314 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.783156 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wxkjx" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.790167 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb"] Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.850769 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.850847 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mlm\" (UniqueName: 
\"kubernetes.io/projected/a525382f-29ee-4393-9e5b-1b3e989a1bc3-kube-api-access-k9mlm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.851036 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.952980 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.953050 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mlm\" (UniqueName: \"kubernetes.io/projected/a525382f-29ee-4393-9e5b-1b3e989a1bc3-kube-api-access-k9mlm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.953132 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-bundle\") pod 
\"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.953979 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.953986 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:03 crc kubenswrapper[4751]: I0131 14:56:03.975254 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mlm\" (UniqueName: \"kubernetes.io/projected/a525382f-29ee-4393-9e5b-1b3e989a1bc3-kube-api-access-k9mlm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.035372 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.097731 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.156507 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fqsm\" (UniqueName: \"kubernetes.io/projected/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-kube-api-access-2fqsm\") pod \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.156562 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-operator-scripts\") pod \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\" (UID: \"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf\") " Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.157372 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" (UID: "6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.162778 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-kube-api-access-2fqsm" (OuterVolumeSpecName: "kube-api-access-2fqsm") pod "6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" (UID: "6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf"). InnerVolumeSpecName "kube-api-access-2fqsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.258750 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fqsm\" (UniqueName: \"kubernetes.io/projected/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-kube-api-access-2fqsm\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.258982 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.395253 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9w6rf"] Jan 31 14:56:04 crc kubenswrapper[4751]: W0131 14:56:04.406313 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb9636cf_e895_422b_8064_ce6d652a85d1.slice/crio-dc0b582d511f1669c6b4a4462f76d30ea3f273270449a3ea86a4620ceb3d1f30 WatchSource:0}: Error finding container dc0b582d511f1669c6b4a4462f76d30ea3f273270449a3ea86a4620ceb3d1f30: Status 404 returned error can't find the container with id dc0b582d511f1669c6b4a4462f76d30ea3f273270449a3ea86a4620ceb3d1f30 Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.499564 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb"] Jan 31 14:56:04 crc kubenswrapper[4751]: W0131 14:56:04.507766 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda525382f_29ee_4393_9e5b_1b3e989a1bc3.slice/crio-1f53cc0349cc95b3204177d9541764f06423422d60741c2505c0c3dc21ce5ef9 WatchSource:0}: Error finding container 1f53cc0349cc95b3204177d9541764f06423422d60741c2505c0c3dc21ce5ef9: Status 404 returned error can't find the container with 
id 1f53cc0349cc95b3204177d9541764f06423422d60741c2505c0c3dc21ce5ef9 Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.860371 4751 generic.go:334] "Generic (PLEG): container finished" podID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerID="ecd0273950524364ff0a405d7ba30af3f5ab2065b0d4986c88176cf55c6327d6" exitCode=0 Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.860419 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" event={"ID":"a525382f-29ee-4393-9e5b-1b3e989a1bc3","Type":"ContainerDied","Data":"ecd0273950524364ff0a405d7ba30af3f5ab2065b0d4986c88176cf55c6327d6"} Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.860686 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" event={"ID":"a525382f-29ee-4393-9e5b-1b3e989a1bc3","Type":"ContainerStarted","Data":"1f53cc0349cc95b3204177d9541764f06423422d60741c2505c0c3dc21ce5ef9"} Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.863172 4751 generic.go:334] "Generic (PLEG): container finished" podID="db9636cf-e895-422b-8064-ce6d652a85d1" containerID="526b390ea0ece0e6fceffb4c1922e3d8cdf235036daf439a7f234526b708d9ea" exitCode=0 Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.863323 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerDied","Data":"526b390ea0ece0e6fceffb4c1922e3d8cdf235036daf439a7f234526b708d9ea"} Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.863423 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerStarted","Data":"dc0b582d511f1669c6b4a4462f76d30ea3f273270449a3ea86a4620ceb3d1f30"} Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.866198 
4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-4gxnx" event={"ID":"6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf","Type":"ContainerDied","Data":"0a2861c5cc0f595bf81985741b866cc835b0dcdfb494adb385879ca0b4137437"} Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.866228 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a2861c5cc0f595bf81985741b866cc835b0dcdfb494adb385879ca0b4137437" Jan 31 14:56:04 crc kubenswrapper[4751]: I0131 14:56:04.866297 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4gxnx" Jan 31 14:56:05 crc kubenswrapper[4751]: I0131 14:56:05.873644 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerStarted","Data":"72edda77bd270bc034f43aee7ab36fd6299606986815e322ded2d2de14aa053b"} Jan 31 14:56:06 crc kubenswrapper[4751]: E0131 14:56:06.476646 4751 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.98:52020->38.102.83.98:44981: write tcp 38.102.83.98:52020->38.102.83.98:44981: write: broken pipe Jan 31 14:56:06 crc kubenswrapper[4751]: I0131 14:56:06.882425 4751 generic.go:334] "Generic (PLEG): container finished" podID="db9636cf-e895-422b-8064-ce6d652a85d1" containerID="72edda77bd270bc034f43aee7ab36fd6299606986815e322ded2d2de14aa053b" exitCode=0 Jan 31 14:56:06 crc kubenswrapper[4751]: I0131 14:56:06.882540 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerDied","Data":"72edda77bd270bc034f43aee7ab36fd6299606986815e322ded2d2de14aa053b"} Jan 31 14:56:06 crc kubenswrapper[4751]: I0131 14:56:06.884787 4751 generic.go:334] "Generic (PLEG): container finished" podID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" 
containerID="acab140e6ca6aa95c6844fc3952eecbc060037dc14e3f1b6a536e962fd34fb0c" exitCode=0 Jan 31 14:56:06 crc kubenswrapper[4751]: I0131 14:56:06.884835 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" event={"ID":"a525382f-29ee-4393-9e5b-1b3e989a1bc3","Type":"ContainerDied","Data":"acab140e6ca6aa95c6844fc3952eecbc060037dc14e3f1b6a536e962fd34fb0c"} Jan 31 14:56:07 crc kubenswrapper[4751]: I0131 14:56:07.756552 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="galera" probeResult="failure" output=< Jan 31 14:56:07 crc kubenswrapper[4751]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 31 14:56:07 crc kubenswrapper[4751]: > Jan 31 14:56:07 crc kubenswrapper[4751]: I0131 14:56:07.893895 4751 generic.go:334] "Generic (PLEG): container finished" podID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerID="f84e5f08594d3f72dc6ce544065026534e30bfc6f05c4074d6d95900baad7f74" exitCode=0 Jan 31 14:56:07 crc kubenswrapper[4751]: I0131 14:56:07.893952 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" event={"ID":"a525382f-29ee-4393-9e5b-1b3e989a1bc3","Type":"ContainerDied","Data":"f84e5f08594d3f72dc6ce544065026534e30bfc6f05c4074d6d95900baad7f74"} Jan 31 14:56:08 crc kubenswrapper[4751]: I0131 14:56:08.902144 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerStarted","Data":"e33a9dabd0f9169517526bac63b6ef7237b0e3ac0120a0efb07aeb141081eeab"} Jan 31 14:56:08 crc kubenswrapper[4751]: I0131 14:56:08.923942 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9w6rf" 
podStartSLOduration=6.122838999 podStartE2EDuration="8.923923646s" podCreationTimestamp="2026-01-31 14:56:00 +0000 UTC" firstStartedPulling="2026-01-31 14:56:04.864267446 +0000 UTC m=+869.238980331" lastFinishedPulling="2026-01-31 14:56:07.665352053 +0000 UTC m=+872.040064978" observedRunningTime="2026-01-31 14:56:08.92253374 +0000 UTC m=+873.297246625" watchObservedRunningTime="2026-01-31 14:56:08.923923646 +0000 UTC m=+873.298636531" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.207932 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.224031 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-bundle\") pod \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.224195 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-util\") pod \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.224249 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9mlm\" (UniqueName: \"kubernetes.io/projected/a525382f-29ee-4393-9e5b-1b3e989a1bc3-kube-api-access-k9mlm\") pod \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\" (UID: \"a525382f-29ee-4393-9e5b-1b3e989a1bc3\") " Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.224557 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-bundle" (OuterVolumeSpecName: "bundle") pod "a525382f-29ee-4393-9e5b-1b3e989a1bc3" 
(UID: "a525382f-29ee-4393-9e5b-1b3e989a1bc3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.224745 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-util" (OuterVolumeSpecName: "util") pod "a525382f-29ee-4393-9e5b-1b3e989a1bc3" (UID: "a525382f-29ee-4393-9e5b-1b3e989a1bc3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.231334 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a525382f-29ee-4393-9e5b-1b3e989a1bc3-kube-api-access-k9mlm" (OuterVolumeSpecName: "kube-api-access-k9mlm") pod "a525382f-29ee-4393-9e5b-1b3e989a1bc3" (UID: "a525382f-29ee-4393-9e5b-1b3e989a1bc3"). InnerVolumeSpecName "kube-api-access-k9mlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.325879 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-util\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.325921 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9mlm\" (UniqueName: \"kubernetes.io/projected/a525382f-29ee-4393-9e5b-1b3e989a1bc3-kube-api-access-k9mlm\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.325938 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a525382f-29ee-4393-9e5b-1b3e989a1bc3-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.909219 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" 
event={"ID":"a525382f-29ee-4393-9e5b-1b3e989a1bc3","Type":"ContainerDied","Data":"1f53cc0349cc95b3204177d9541764f06423422d60741c2505c0c3dc21ce5ef9"} Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.909264 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f53cc0349cc95b3204177d9541764f06423422d60741c2505c0c3dc21ce5ef9" Jan 31 14:56:09 crc kubenswrapper[4751]: I0131 14:56:09.909238 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb" Jan 31 14:56:10 crc kubenswrapper[4751]: I0131 14:56:10.673052 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:10 crc kubenswrapper[4751]: I0131 14:56:10.673378 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:11 crc kubenswrapper[4751]: I0131 14:56:11.735124 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9w6rf" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="registry-server" probeResult="failure" output=< Jan 31 14:56:11 crc kubenswrapper[4751]: timeout: failed to connect service ":50051" within 1s Jan 31 14:56:11 crc kubenswrapper[4751]: > Jan 31 14:56:15 crc kubenswrapper[4751]: I0131 14:56:15.150855 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:56:15 crc kubenswrapper[4751]: I0131 14:56:15.240686 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.575611 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"] Jan 31 14:56:17 crc kubenswrapper[4751]: E0131 
14:56:17.576313 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" containerName="mariadb-account-create-update" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.576335 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" containerName="mariadb-account-create-update" Jan 31 14:56:17 crc kubenswrapper[4751]: E0131 14:56:17.576349 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="extract" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.576360 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="extract" Jan 31 14:56:17 crc kubenswrapper[4751]: E0131 14:56:17.576385 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="pull" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.576395 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="pull" Jan 31 14:56:17 crc kubenswrapper[4751]: E0131 14:56:17.576419 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="util" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.576432 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="util" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.576630 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" containerName="extract" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.576646 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" containerName="mariadb-account-create-update" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.577312 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.579103 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-b8cw4" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.593405 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"] Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.736531 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5bkt\" (UniqueName: \"kubernetes.io/projected/3b77f113-f8c0-47b8-ad79-d1be38bf6e09-kube-api-access-x5bkt\") pod \"rabbitmq-cluster-operator-779fc9694b-fnbvg\" (UID: \"3b77f113-f8c0-47b8-ad79-d1be38bf6e09\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.838278 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5bkt\" (UniqueName: \"kubernetes.io/projected/3b77f113-f8c0-47b8-ad79-d1be38bf6e09-kube-api-access-x5bkt\") pod \"rabbitmq-cluster-operator-779fc9694b-fnbvg\" (UID: \"3b77f113-f8c0-47b8-ad79-d1be38bf6e09\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.869967 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5bkt\" (UniqueName: \"kubernetes.io/projected/3b77f113-f8c0-47b8-ad79-d1be38bf6e09-kube-api-access-x5bkt\") pod \"rabbitmq-cluster-operator-779fc9694b-fnbvg\" (UID: \"3b77f113-f8c0-47b8-ad79-d1be38bf6e09\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" Jan 31 14:56:17 crc kubenswrapper[4751]: I0131 14:56:17.905443 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" Jan 31 14:56:18 crc kubenswrapper[4751]: I0131 14:56:18.421998 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"] Jan 31 14:56:18 crc kubenswrapper[4751]: I0131 14:56:18.503921 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:56:18 crc kubenswrapper[4751]: I0131 14:56:18.584792 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 14:56:18 crc kubenswrapper[4751]: I0131 14:56:18.980239 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" event={"ID":"3b77f113-f8c0-47b8-ad79-d1be38bf6e09","Type":"ContainerStarted","Data":"9417f04e17815ef9de6ec5d2357c85d9f600b65c7a818fc63c494820d893f560"} Jan 31 14:56:20 crc kubenswrapper[4751]: I0131 14:56:20.740216 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:20 crc kubenswrapper[4751]: I0131 14:56:20.797224 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:22 crc kubenswrapper[4751]: I0131 14:56:22.014313 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" event={"ID":"3b77f113-f8c0-47b8-ad79-d1be38bf6e09","Type":"ContainerStarted","Data":"669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f"} Jan 31 14:56:22 crc kubenswrapper[4751]: I0131 14:56:22.047511 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" podStartSLOduration=1.7411855950000001 podStartE2EDuration="5.047473754s" 
podCreationTimestamp="2026-01-31 14:56:17 +0000 UTC" firstStartedPulling="2026-01-31 14:56:18.435774268 +0000 UTC m=+882.810487153" lastFinishedPulling="2026-01-31 14:56:21.742062407 +0000 UTC m=+886.116775312" observedRunningTime="2026-01-31 14:56:22.036791303 +0000 UTC m=+886.411504228" watchObservedRunningTime="2026-01-31 14:56:22.047473754 +0000 UTC m=+886.422186679" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.003876 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.004998 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.007669 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.008041 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-8vh75" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.008217 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.008352 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.008709 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.015175 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.148827 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/19317a08-b18b-42c9-bdc9-394e1e06257d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.149039 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.149300 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.153042 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f145c232-830a-4841-bd1f-7c42e25cd443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.153160 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19317a08-b18b-42c9-bdc9-394e1e06257d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.153310 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zrbjl\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-kube-api-access-zrbjl\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.153370 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19317a08-b18b-42c9-bdc9-394e1e06257d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.153488 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.254602 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.254679 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f145c232-830a-4841-bd1f-7c42e25cd443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.255357 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19317a08-b18b-42c9-bdc9-394e1e06257d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.255433 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.255466 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbjl\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-kube-api-access-zrbjl\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.255917 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19317a08-b18b-42c9-bdc9-394e1e06257d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.256035 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.256536 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.257056 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19317a08-b18b-42c9-bdc9-394e1e06257d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.257188 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19317a08-b18b-42c9-bdc9-394e1e06257d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.257241 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.258257 4751 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.258305 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f145c232-830a-4841-bd1f-7c42e25cd443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/477b9b8ecff0fd3c5085f9312279f3fdf5646254d348b830ad73ff5a0f99fc7f/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.262963 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19317a08-b18b-42c9-bdc9-394e1e06257d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.265654 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19317a08-b18b-42c9-bdc9-394e1e06257d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.278428 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.284132 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbjl\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-kube-api-access-zrbjl\") pod \"rabbitmq-server-0\" (UID: 
\"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.294979 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f145c232-830a-4841-bd1f-7c42e25cd443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") pod \"rabbitmq-server-0\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.328098 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.646788 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 14:56:24 crc kubenswrapper[4751]: W0131 14:56:24.656211 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19317a08_b18b_42c9_bdc9_394e1e06257d.slice/crio-f6c134f960dca8717c0eb288c9e0a54cef2dc5968f5f68b04744d850b9ec573e WatchSource:0}: Error finding container f6c134f960dca8717c0eb288c9e0a54cef2dc5968f5f68b04744d850b9ec573e: Status 404 returned error can't find the container with id f6c134f960dca8717c0eb288c9e0a54cef2dc5968f5f68b04744d850b9ec573e Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.709535 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9w6rf"] Jan 31 14:56:24 crc kubenswrapper[4751]: I0131 14:56:24.709751 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9w6rf" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="registry-server" containerID="cri-o://e33a9dabd0f9169517526bac63b6ef7237b0e3ac0120a0efb07aeb141081eeab" gracePeriod=2 Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.057140 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"19317a08-b18b-42c9-bdc9-394e1e06257d","Type":"ContainerStarted","Data":"f6c134f960dca8717c0eb288c9e0a54cef2dc5968f5f68b04744d850b9ec573e"} Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.061447 4751 generic.go:334] "Generic (PLEG): container finished" podID="db9636cf-e895-422b-8064-ce6d652a85d1" containerID="e33a9dabd0f9169517526bac63b6ef7237b0e3ac0120a0efb07aeb141081eeab" exitCode=0 Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.061495 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerDied","Data":"e33a9dabd0f9169517526bac63b6ef7237b0e3ac0120a0efb07aeb141081eeab"} Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.654211 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.679954 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-catalog-content\") pod \"db9636cf-e895-422b-8064-ce6d652a85d1\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.680047 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-utilities\") pod \"db9636cf-e895-422b-8064-ce6d652a85d1\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.680088 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2b4z\" (UniqueName: \"kubernetes.io/projected/db9636cf-e895-422b-8064-ce6d652a85d1-kube-api-access-w2b4z\") pod 
\"db9636cf-e895-422b-8064-ce6d652a85d1\" (UID: \"db9636cf-e895-422b-8064-ce6d652a85d1\") " Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.681206 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-utilities" (OuterVolumeSpecName: "utilities") pod "db9636cf-e895-422b-8064-ce6d652a85d1" (UID: "db9636cf-e895-422b-8064-ce6d652a85d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.685702 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db9636cf-e895-422b-8064-ce6d652a85d1-kube-api-access-w2b4z" (OuterVolumeSpecName: "kube-api-access-w2b4z") pod "db9636cf-e895-422b-8064-ce6d652a85d1" (UID: "db9636cf-e895-422b-8064-ce6d652a85d1"). InnerVolumeSpecName "kube-api-access-w2b4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.781653 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.781952 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2b4z\" (UniqueName: \"kubernetes.io/projected/db9636cf-e895-422b-8064-ce6d652a85d1-kube-api-access-w2b4z\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.819789 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db9636cf-e895-422b-8064-ce6d652a85d1" (UID: "db9636cf-e895-422b-8064-ce6d652a85d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:25 crc kubenswrapper[4751]: I0131 14:56:25.883408 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9636cf-e895-422b-8064-ce6d652a85d1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.071632 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9w6rf" event={"ID":"db9636cf-e895-422b-8064-ce6d652a85d1","Type":"ContainerDied","Data":"dc0b582d511f1669c6b4a4462f76d30ea3f273270449a3ea86a4620ceb3d1f30"} Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.071719 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9w6rf" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.072104 4751 scope.go:117] "RemoveContainer" containerID="e33a9dabd0f9169517526bac63b6ef7237b0e3ac0120a0efb07aeb141081eeab" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.099689 4751 scope.go:117] "RemoveContainer" containerID="72edda77bd270bc034f43aee7ab36fd6299606986815e322ded2d2de14aa053b" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.102050 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9w6rf"] Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.108478 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9w6rf"] Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.128058 4751 scope.go:117] "RemoveContainer" containerID="526b390ea0ece0e6fceffb4c1922e3d8cdf235036daf439a7f234526b708d9ea" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.413256 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" path="/var/lib/kubelet/pods/db9636cf-e895-422b-8064-ce6d652a85d1/volumes" Jan 31 14:56:26 crc 
kubenswrapper[4751]: I0131 14:56:26.743654 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-6bwnv"] Jan 31 14:56:26 crc kubenswrapper[4751]: E0131 14:56:26.745032 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="registry-server" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.745111 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="registry-server" Jan 31 14:56:26 crc kubenswrapper[4751]: E0131 14:56:26.745161 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="extract-utilities" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.745184 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="extract-utilities" Jan 31 14:56:26 crc kubenswrapper[4751]: E0131 14:56:26.745236 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="extract-content" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.745255 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="extract-content" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.746220 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="db9636cf-e895-422b-8064-ce6d652a85d1" containerName="registry-server" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.748439 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.753642 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-x7dlp" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.776123 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6bwnv"] Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.796139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msbp2\" (UniqueName: \"kubernetes.io/projected/08530f42-16c5-4253-a623-2a032aeb95a7-kube-api-access-msbp2\") pod \"keystone-operator-index-6bwnv\" (UID: \"08530f42-16c5-4253-a623-2a032aeb95a7\") " pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.897294 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msbp2\" (UniqueName: \"kubernetes.io/projected/08530f42-16c5-4253-a623-2a032aeb95a7-kube-api-access-msbp2\") pod \"keystone-operator-index-6bwnv\" (UID: \"08530f42-16c5-4253-a623-2a032aeb95a7\") " pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:26 crc kubenswrapper[4751]: I0131 14:56:26.915516 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msbp2\" (UniqueName: \"kubernetes.io/projected/08530f42-16c5-4253-a623-2a032aeb95a7-kube-api-access-msbp2\") pod \"keystone-operator-index-6bwnv\" (UID: \"08530f42-16c5-4253-a623-2a032aeb95a7\") " pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:27 crc kubenswrapper[4751]: I0131 14:56:27.074008 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:27 crc kubenswrapper[4751]: I0131 14:56:27.503229 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-6bwnv"] Jan 31 14:56:28 crc kubenswrapper[4751]: I0131 14:56:28.091546 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6bwnv" event={"ID":"08530f42-16c5-4253-a623-2a032aeb95a7","Type":"ContainerStarted","Data":"287dcdc51fb3cdd7484c91633318b58c60ad6d8b753d031c65761b77a7b8670b"} Jan 31 14:56:30 crc kubenswrapper[4751]: I0131 14:56:30.110487 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6bwnv" event={"ID":"08530f42-16c5-4253-a623-2a032aeb95a7","Type":"ContainerStarted","Data":"3035639c5750cf779b9b57b5d0ade23abfc3c28de57f8e43e074a91f02a62e68"} Jan 31 14:56:30 crc kubenswrapper[4751]: I0131 14:56:30.132366 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-6bwnv" podStartSLOduration=3.17874397 podStartE2EDuration="4.132337466s" podCreationTimestamp="2026-01-31 14:56:26 +0000 UTC" firstStartedPulling="2026-01-31 14:56:27.52521991 +0000 UTC m=+891.899932835" lastFinishedPulling="2026-01-31 14:56:28.478813426 +0000 UTC m=+892.853526331" observedRunningTime="2026-01-31 14:56:30.123470623 +0000 UTC m=+894.498183518" watchObservedRunningTime="2026-01-31 14:56:30.132337466 +0000 UTC m=+894.507050381" Jan 31 14:56:35 crc kubenswrapper[4751]: I0131 14:56:35.144977 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"19317a08-b18b-42c9-bdc9-394e1e06257d","Type":"ContainerStarted","Data":"505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586"} Jan 31 14:56:36 crc kubenswrapper[4751]: I0131 14:56:36.927831 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-g8hgc"] Jan 31 14:56:36 crc kubenswrapper[4751]: I0131 14:56:36.930289 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:36 crc kubenswrapper[4751]: I0131 14:56:36.951177 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8hgc"] Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.048815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-utilities\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.048886 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjhf4\" (UniqueName: \"kubernetes.io/projected/eb6fb532-641c-459e-bb99-ba0f9779510c-kube-api-access-pjhf4\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.048912 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-catalog-content\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.074572 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.074604 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.099301 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.150486 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-utilities\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.150594 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjhf4\" (UniqueName: \"kubernetes.io/projected/eb6fb532-641c-459e-bb99-ba0f9779510c-kube-api-access-pjhf4\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.150630 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-catalog-content\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.150897 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-utilities\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.151044 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-catalog-content\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.172786 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjhf4\" (UniqueName: \"kubernetes.io/projected/eb6fb532-641c-459e-bb99-ba0f9779510c-kube-api-access-pjhf4\") pod \"redhat-marketplace-g8hgc\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.197804 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.262677 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:37 crc kubenswrapper[4751]: I0131 14:56:37.711413 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8hgc"] Jan 31 14:56:37 crc kubenswrapper[4751]: W0131 14:56:37.719571 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6fb532_641c_459e_bb99_ba0f9779510c.slice/crio-b08d3a83c80ca64edae98efdfdbc3ea7e3985a0df616ed7d528a097908ab573a WatchSource:0}: Error finding container b08d3a83c80ca64edae98efdfdbc3ea7e3985a0df616ed7d528a097908ab573a: Status 404 returned error can't find the container with id b08d3a83c80ca64edae98efdfdbc3ea7e3985a0df616ed7d528a097908ab573a Jan 31 14:56:38 crc kubenswrapper[4751]: I0131 14:56:38.171731 4751 generic.go:334] "Generic (PLEG): container finished" podID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerID="f76a9ace8d656c05224f8d7e9efd34944bad63abdbcc558bcbf72f5ac86facb5" exitCode=0 Jan 31 14:56:38 crc 
kubenswrapper[4751]: I0131 14:56:38.172983 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8hgc" event={"ID":"eb6fb532-641c-459e-bb99-ba0f9779510c","Type":"ContainerDied","Data":"f76a9ace8d656c05224f8d7e9efd34944bad63abdbcc558bcbf72f5ac86facb5"} Jan 31 14:56:38 crc kubenswrapper[4751]: I0131 14:56:38.173011 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8hgc" event={"ID":"eb6fb532-641c-459e-bb99-ba0f9779510c","Type":"ContainerStarted","Data":"b08d3a83c80ca64edae98efdfdbc3ea7e3985a0df616ed7d528a097908ab573a"} Jan 31 14:56:39 crc kubenswrapper[4751]: I0131 14:56:39.180658 4751 generic.go:334] "Generic (PLEG): container finished" podID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerID="573923f23b16312c13dc5ab0d41de7478d1a0111af903d7fde9c795304380992" exitCode=0 Jan 31 14:56:39 crc kubenswrapper[4751]: I0131 14:56:39.180892 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8hgc" event={"ID":"eb6fb532-641c-459e-bb99-ba0f9779510c","Type":"ContainerDied","Data":"573923f23b16312c13dc5ab0d41de7478d1a0111af903d7fde9c795304380992"} Jan 31 14:56:40 crc kubenswrapper[4751]: I0131 14:56:40.191509 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8hgc" event={"ID":"eb6fb532-641c-459e-bb99-ba0f9779510c","Type":"ContainerStarted","Data":"91f8565e18d5814616edf1f9308b1748710df7aa51f507a293df07392a1fe336"} Jan 31 14:56:40 crc kubenswrapper[4751]: I0131 14:56:40.214149 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g8hgc" podStartSLOduration=2.588494662 podStartE2EDuration="4.214128406s" podCreationTimestamp="2026-01-31 14:56:36 +0000 UTC" firstStartedPulling="2026-01-31 14:56:38.173655881 +0000 UTC m=+902.548368766" lastFinishedPulling="2026-01-31 14:56:39.799289625 +0000 UTC m=+904.174002510" 
observedRunningTime="2026-01-31 14:56:40.208736964 +0000 UTC m=+904.583449869" watchObservedRunningTime="2026-01-31 14:56:40.214128406 +0000 UTC m=+904.588841301" Jan 31 14:56:41 crc kubenswrapper[4751]: I0131 14:56:41.967983 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr"] Jan 31 14:56:41 crc kubenswrapper[4751]: I0131 14:56:41.970194 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:41 crc kubenswrapper[4751]: I0131 14:56:41.976008 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wxkjx" Jan 31 14:56:41 crc kubenswrapper[4751]: I0131 14:56:41.979658 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr"] Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.114457 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.114521 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.114578 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pss5r\" (UniqueName: \"kubernetes.io/projected/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-kube-api-access-pss5r\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.216697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.216836 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.217006 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pss5r\" (UniqueName: \"kubernetes.io/projected/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-kube-api-access-pss5r\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.218030 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.218322 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.238306 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pss5r\" (UniqueName: \"kubernetes.io/projected/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-kube-api-access-pss5r\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.301940 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:42 crc kubenswrapper[4751]: I0131 14:56:42.778974 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr"] Jan 31 14:56:42 crc kubenswrapper[4751]: W0131 14:56:42.792214 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772cd794_fe9a_4ac3_8df8_e7f29edb85bf.slice/crio-2fb806f8b06eb6a79ed37ad373eb1b349cbf24edb359d55111d6e03a3fda0ba6 WatchSource:0}: Error finding container 2fb806f8b06eb6a79ed37ad373eb1b349cbf24edb359d55111d6e03a3fda0ba6: Status 404 returned error can't find the container with id 2fb806f8b06eb6a79ed37ad373eb1b349cbf24edb359d55111d6e03a3fda0ba6 Jan 31 14:56:43 crc kubenswrapper[4751]: I0131 14:56:43.213652 4751 generic.go:334] "Generic (PLEG): container finished" podID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerID="8f2f8355ecce67c5c0aa186fe2a2c3a5d75143a19a9cc7d982cad7e44dc2d94f" exitCode=0 Jan 31 14:56:43 crc kubenswrapper[4751]: I0131 14:56:43.213863 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" event={"ID":"772cd794-fe9a-4ac3-8df8-e7f29edb85bf","Type":"ContainerDied","Data":"8f2f8355ecce67c5c0aa186fe2a2c3a5d75143a19a9cc7d982cad7e44dc2d94f"} Jan 31 14:56:43 crc kubenswrapper[4751]: I0131 14:56:43.213928 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" event={"ID":"772cd794-fe9a-4ac3-8df8-e7f29edb85bf","Type":"ContainerStarted","Data":"2fb806f8b06eb6a79ed37ad373eb1b349cbf24edb359d55111d6e03a3fda0ba6"} Jan 31 14:56:44 crc kubenswrapper[4751]: I0131 14:56:44.221176 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerID="24848de7678f7cd58f76b4f47400dce420906e54dfe8d1ef4c220211c4bbb57e" exitCode=0 Jan 31 14:56:44 crc kubenswrapper[4751]: I0131 14:56:44.221295 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" event={"ID":"772cd794-fe9a-4ac3-8df8-e7f29edb85bf","Type":"ContainerDied","Data":"24848de7678f7cd58f76b4f47400dce420906e54dfe8d1ef4c220211c4bbb57e"} Jan 31 14:56:45 crc kubenswrapper[4751]: I0131 14:56:45.231363 4751 generic.go:334] "Generic (PLEG): container finished" podID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerID="8784247046f02ab2d8c0a52ce8233e64d23a7cd286c98e45a4c36115e6daf6d3" exitCode=0 Jan 31 14:56:45 crc kubenswrapper[4751]: I0131 14:56:45.231407 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" event={"ID":"772cd794-fe9a-4ac3-8df8-e7f29edb85bf","Type":"ContainerDied","Data":"8784247046f02ab2d8c0a52ce8233e64d23a7cd286c98e45a4c36115e6daf6d3"} Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.519894 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.698272 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-util\") pod \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.698385 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-bundle\") pod \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.698540 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pss5r\" (UniqueName: \"kubernetes.io/projected/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-kube-api-access-pss5r\") pod \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\" (UID: \"772cd794-fe9a-4ac3-8df8-e7f29edb85bf\") " Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.700231 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-bundle" (OuterVolumeSpecName: "bundle") pod "772cd794-fe9a-4ac3-8df8-e7f29edb85bf" (UID: "772cd794-fe9a-4ac3-8df8-e7f29edb85bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.705057 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-kube-api-access-pss5r" (OuterVolumeSpecName: "kube-api-access-pss5r") pod "772cd794-fe9a-4ac3-8df8-e7f29edb85bf" (UID: "772cd794-fe9a-4ac3-8df8-e7f29edb85bf"). InnerVolumeSpecName "kube-api-access-pss5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.731058 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-util" (OuterVolumeSpecName: "util") pod "772cd794-fe9a-4ac3-8df8-e7f29edb85bf" (UID: "772cd794-fe9a-4ac3-8df8-e7f29edb85bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.800554 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.800590 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pss5r\" (UniqueName: \"kubernetes.io/projected/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-kube-api-access-pss5r\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:46 crc kubenswrapper[4751]: I0131 14:56:46.800603 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/772cd794-fe9a-4ac3-8df8-e7f29edb85bf-util\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.246150 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" event={"ID":"772cd794-fe9a-4ac3-8df8-e7f29edb85bf","Type":"ContainerDied","Data":"2fb806f8b06eb6a79ed37ad373eb1b349cbf24edb359d55111d6e03a3fda0ba6"} Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.246463 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fb806f8b06eb6a79ed37ad373eb1b349cbf24edb359d55111d6e03a3fda0ba6" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.246212 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.263183 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.263366 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.305171 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.527182 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m96j8"] Jan 31 14:56:47 crc kubenswrapper[4751]: E0131 14:56:47.527478 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="extract" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.527495 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="extract" Jan 31 14:56:47 crc kubenswrapper[4751]: E0131 14:56:47.527517 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="pull" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.527527 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="pull" Jan 31 14:56:47 crc kubenswrapper[4751]: E0131 14:56:47.527547 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="util" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.527555 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="util" Jan 31 14:56:47 
crc kubenswrapper[4751]: I0131 14:56:47.527697 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" containerName="extract" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.528773 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.548425 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m96j8"] Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.711102 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-catalog-content\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.711164 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-utilities\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.711334 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgqw\" (UniqueName: \"kubernetes.io/projected/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-kube-api-access-7lgqw\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.812975 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgqw\" (UniqueName: 
\"kubernetes.io/projected/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-kube-api-access-7lgqw\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.813113 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-catalog-content\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.813138 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-utilities\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.813525 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-utilities\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.813652 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-catalog-content\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.833310 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgqw\" (UniqueName: 
\"kubernetes.io/projected/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-kube-api-access-7lgqw\") pod \"community-operators-m96j8\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:47 crc kubenswrapper[4751]: I0131 14:56:47.845740 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:48 crc kubenswrapper[4751]: I0131 14:56:48.290344 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m96j8"] Jan 31 14:56:48 crc kubenswrapper[4751]: I0131 14:56:48.319582 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.259446 4751 generic.go:334] "Generic (PLEG): container finished" podID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerID="c5d30bd3425343861aefae2acc945d17403c59649b3737361473864cd06659ea" exitCode=0 Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.259506 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m96j8" event={"ID":"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40","Type":"ContainerDied","Data":"c5d30bd3425343861aefae2acc945d17403c59649b3737361473864cd06659ea"} Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.259895 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m96j8" event={"ID":"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40","Type":"ContainerStarted","Data":"7a18a3e9ad73c9dedb3fcbe2dae3a06a1766a1c7a3e3e74f8e52e63336ce4c6a"} Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.736966 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jqv5m"] Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.797760 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-jqv5m"] Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.797864 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.946500 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-utilities\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.946557 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-catalog-content\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:49 crc kubenswrapper[4751]: I0131 14:56:49.946595 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkft\" (UniqueName: \"kubernetes.io/projected/10399bf7-0161-488c-8001-e6ba927889e5-kube-api-access-cqkft\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.048384 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkft\" (UniqueName: \"kubernetes.io/projected/10399bf7-0161-488c-8001-e6ba927889e5-kube-api-access-cqkft\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.048489 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-utilities\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.048519 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-catalog-content\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.048907 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-catalog-content\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.049303 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-utilities\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.067480 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkft\" (UniqueName: \"kubernetes.io/projected/10399bf7-0161-488c-8001-e6ba927889e5-kube-api-access-cqkft\") pod \"certified-operators-jqv5m\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.141763 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:56:50 crc kubenswrapper[4751]: I0131 14:56:50.582650 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jqv5m"] Jan 31 14:56:51 crc kubenswrapper[4751]: I0131 14:56:51.279008 4751 generic.go:334] "Generic (PLEG): container finished" podID="10399bf7-0161-488c-8001-e6ba927889e5" containerID="e8bb73c8fea313592a221df9e5dbff86d9e1c2c8a380afe92fecdcb99f557785" exitCode=0 Jan 31 14:56:51 crc kubenswrapper[4751]: I0131 14:56:51.279082 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerDied","Data":"e8bb73c8fea313592a221df9e5dbff86d9e1c2c8a380afe92fecdcb99f557785"} Jan 31 14:56:51 crc kubenswrapper[4751]: I0131 14:56:51.279107 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerStarted","Data":"1db0e98d5840943d47fedc71d5069e41ebcf9dcd1cc035ff41c419ba526b7bb7"} Jan 31 14:56:51 crc kubenswrapper[4751]: I0131 14:56:51.284388 4751 generic.go:334] "Generic (PLEG): container finished" podID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerID="0a2dcc31122c7c5482843a5e80399a6846c7271da25c796eef9ce298a6180701" exitCode=0 Jan 31 14:56:51 crc kubenswrapper[4751]: I0131 14:56:51.284431 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m96j8" event={"ID":"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40","Type":"ContainerDied","Data":"0a2dcc31122c7c5482843a5e80399a6846c7271da25c796eef9ce298a6180701"} Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.112158 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8hgc"] Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.112672 4751 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g8hgc" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="registry-server" containerID="cri-o://91f8565e18d5814616edf1f9308b1748710df7aa51f507a293df07392a1fe336" gracePeriod=2 Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.292771 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerStarted","Data":"9f513ef2d1f9802f83bbe5839191b499dfe0fb39587e4e549696c1f2a0b3b5c7"} Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.296169 4751 generic.go:334] "Generic (PLEG): container finished" podID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerID="91f8565e18d5814616edf1f9308b1748710df7aa51f507a293df07392a1fe336" exitCode=0 Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.296249 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8hgc" event={"ID":"eb6fb532-641c-459e-bb99-ba0f9779510c","Type":"ContainerDied","Data":"91f8565e18d5814616edf1f9308b1748710df7aa51f507a293df07392a1fe336"} Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.299397 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m96j8" event={"ID":"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40","Type":"ContainerStarted","Data":"8f5e6c80881d23c78dc00e4be207273e5eb7f1474c90cd90d5b02783a4206916"} Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.330342 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m96j8" podStartSLOduration=2.742024047 podStartE2EDuration="5.330328138s" podCreationTimestamp="2026-01-31 14:56:47 +0000 UTC" firstStartedPulling="2026-01-31 14:56:49.263432813 +0000 UTC m=+913.638145708" lastFinishedPulling="2026-01-31 14:56:51.851736874 +0000 UTC m=+916.226449799" observedRunningTime="2026-01-31 
14:56:52.327888464 +0000 UTC m=+916.702601349" watchObservedRunningTime="2026-01-31 14:56:52.330328138 +0000 UTC m=+916.705041023" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.569706 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.683039 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjhf4\" (UniqueName: \"kubernetes.io/projected/eb6fb532-641c-459e-bb99-ba0f9779510c-kube-api-access-pjhf4\") pod \"eb6fb532-641c-459e-bb99-ba0f9779510c\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.683212 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-catalog-content\") pod \"eb6fb532-641c-459e-bb99-ba0f9779510c\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.683239 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-utilities\") pod \"eb6fb532-641c-459e-bb99-ba0f9779510c\" (UID: \"eb6fb532-641c-459e-bb99-ba0f9779510c\") " Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.684021 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-utilities" (OuterVolumeSpecName: "utilities") pod "eb6fb532-641c-459e-bb99-ba0f9779510c" (UID: "eb6fb532-641c-459e-bb99-ba0f9779510c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.689789 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6fb532-641c-459e-bb99-ba0f9779510c-kube-api-access-pjhf4" (OuterVolumeSpecName: "kube-api-access-pjhf4") pod "eb6fb532-641c-459e-bb99-ba0f9779510c" (UID: "eb6fb532-641c-459e-bb99-ba0f9779510c"). InnerVolumeSpecName "kube-api-access-pjhf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.703389 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb6fb532-641c-459e-bb99-ba0f9779510c" (UID: "eb6fb532-641c-459e-bb99-ba0f9779510c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.785177 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.785402 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6fb532-641c-459e-bb99-ba0f9779510c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:52 crc kubenswrapper[4751]: I0131 14:56:52.785462 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjhf4\" (UniqueName: \"kubernetes.io/projected/eb6fb532-641c-459e-bb99-ba0f9779510c-kube-api-access-pjhf4\") on node \"crc\" DevicePath \"\"" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.306945 4751 generic.go:334] "Generic (PLEG): container finished" podID="10399bf7-0161-488c-8001-e6ba927889e5" 
containerID="9f513ef2d1f9802f83bbe5839191b499dfe0fb39587e4e549696c1f2a0b3b5c7" exitCode=0 Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.306979 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerDied","Data":"9f513ef2d1f9802f83bbe5839191b499dfe0fb39587e4e549696c1f2a0b3b5c7"} Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.309722 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g8hgc" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.317480 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g8hgc" event={"ID":"eb6fb532-641c-459e-bb99-ba0f9779510c","Type":"ContainerDied","Data":"b08d3a83c80ca64edae98efdfdbc3ea7e3985a0df616ed7d528a097908ab573a"} Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.317544 4751 scope.go:117] "RemoveContainer" containerID="91f8565e18d5814616edf1f9308b1748710df7aa51f507a293df07392a1fe336" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.348754 4751 scope.go:117] "RemoveContainer" containerID="573923f23b16312c13dc5ab0d41de7478d1a0111af903d7fde9c795304380992" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.374310 4751 scope.go:117] "RemoveContainer" containerID="f76a9ace8d656c05224f8d7e9efd34944bad63abdbcc558bcbf72f5ac86facb5" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.381593 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8hgc"] Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.388909 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g8hgc"] Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.724167 4751 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"] Jan 31 14:56:53 crc kubenswrapper[4751]: E0131 14:56:53.724491 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="registry-server" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.724513 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="registry-server" Jan 31 14:56:53 crc kubenswrapper[4751]: E0131 14:56:53.724536 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="extract-utilities" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.724547 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="extract-utilities" Jan 31 14:56:53 crc kubenswrapper[4751]: E0131 14:56:53.724563 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="extract-content" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.724570 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="extract-content" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.724707 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" containerName="registry-server" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.725282 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.727434 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.727566 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fpgv8" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.737788 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"] Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.900818 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-webhook-cert\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.901124 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-apiservice-cert\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:53 crc kubenswrapper[4751]: I0131 14:56:53.901236 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdb6\" (UniqueName: \"kubernetes.io/projected/49ea8aae-ad89-4383-8f2f-ba35872fd605-kube-api-access-mvdb6\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" 
(UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.002734 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-apiservice-cert\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.002824 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdb6\" (UniqueName: \"kubernetes.io/projected/49ea8aae-ad89-4383-8f2f-ba35872fd605-kube-api-access-mvdb6\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.002892 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-webhook-cert\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.008557 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-webhook-cert\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.008723 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-apiservice-cert\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.019661 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdb6\" (UniqueName: \"kubernetes.io/projected/49ea8aae-ad89-4383-8f2f-ba35872fd605-kube-api-access-mvdb6\") pod \"keystone-operator-controller-manager-7f68887647-qvqrq\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.050608 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.322033 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerStarted","Data":"d6f182b4c85d9e139e51b5b38efb79b57d02af9a0357ca70a93a9631b5745566"} Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.350447 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jqv5m" podStartSLOduration=2.787879903 podStartE2EDuration="5.350425956s" podCreationTimestamp="2026-01-31 14:56:49 +0000 UTC" firstStartedPulling="2026-01-31 14:56:51.281017395 +0000 UTC m=+915.655730280" lastFinishedPulling="2026-01-31 14:56:53.843563448 +0000 UTC m=+918.218276333" observedRunningTime="2026-01-31 14:56:54.346699998 +0000 UTC m=+918.721412903" watchObservedRunningTime="2026-01-31 14:56:54.350425956 +0000 UTC 
m=+918.725138851" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.416935 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6fb532-641c-459e-bb99-ba0f9779510c" path="/var/lib/kubelet/pods/eb6fb532-641c-459e-bb99-ba0f9779510c/volumes" Jan 31 14:56:54 crc kubenswrapper[4751]: I0131 14:56:54.539925 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"] Jan 31 14:56:54 crc kubenswrapper[4751]: W0131 14:56:54.546307 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ea8aae_ad89_4383_8f2f_ba35872fd605.slice/crio-57b02fb0aa9fe148da3716e9376b1f52be8527b141c4d304bf8040ea0b69e451 WatchSource:0}: Error finding container 57b02fb0aa9fe148da3716e9376b1f52be8527b141c4d304bf8040ea0b69e451: Status 404 returned error can't find the container with id 57b02fb0aa9fe148da3716e9376b1f52be8527b141c4d304bf8040ea0b69e451 Jan 31 14:56:55 crc kubenswrapper[4751]: I0131 14:56:55.372977 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" event={"ID":"49ea8aae-ad89-4383-8f2f-ba35872fd605","Type":"ContainerStarted","Data":"57b02fb0aa9fe148da3716e9376b1f52be8527b141c4d304bf8040ea0b69e451"} Jan 31 14:56:57 crc kubenswrapper[4751]: I0131 14:56:57.846255 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:57 crc kubenswrapper[4751]: I0131 14:56:57.846576 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:57 crc kubenswrapper[4751]: I0131 14:56:57.886282 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:58 crc kubenswrapper[4751]: I0131 14:56:58.449506 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:56:59 crc kubenswrapper[4751]: I0131 14:56:59.402298 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" event={"ID":"49ea8aae-ad89-4383-8f2f-ba35872fd605","Type":"ContainerStarted","Data":"0c37d2b2bcc47557f4028d9e251b0db237f4b56ff1d49ca666627d0449655ab2"} Jan 31 14:56:59 crc kubenswrapper[4751]: I0131 14:56:59.422120 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" podStartSLOduration=1.859547294 podStartE2EDuration="6.422101378s" podCreationTimestamp="2026-01-31 14:56:53 +0000 UTC" firstStartedPulling="2026-01-31 14:56:54.549121985 +0000 UTC m=+918.923834880" lastFinishedPulling="2026-01-31 14:56:59.111676079 +0000 UTC m=+923.486388964" observedRunningTime="2026-01-31 14:56:59.419689264 +0000 UTC m=+923.794402179" watchObservedRunningTime="2026-01-31 14:56:59.422101378 +0000 UTC m=+923.796814273" Jan 31 14:57:00 crc kubenswrapper[4751]: I0131 14:57:00.142117 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:57:00 crc kubenswrapper[4751]: I0131 14:57:00.142177 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:57:00 crc kubenswrapper[4751]: I0131 14:57:00.217144 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:57:00 crc kubenswrapper[4751]: I0131 14:57:00.413764 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:57:00 crc kubenswrapper[4751]: I0131 14:57:00.459790 4751 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:57:02 crc kubenswrapper[4751]: I0131 14:57:02.911761 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m96j8"] Jan 31 14:57:02 crc kubenswrapper[4751]: I0131 14:57:02.912243 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m96j8" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="registry-server" containerID="cri-o://8f5e6c80881d23c78dc00e4be207273e5eb7f1474c90cd90d5b02783a4206916" gracePeriod=2 Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.055023 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.434638 4751 generic.go:334] "Generic (PLEG): container finished" podID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerID="8f5e6c80881d23c78dc00e4be207273e5eb7f1474c90cd90d5b02783a4206916" exitCode=0 Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.434689 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m96j8" event={"ID":"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40","Type":"ContainerDied","Data":"8f5e6c80881d23c78dc00e4be207273e5eb7f1474c90cd90d5b02783a4206916"} Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.434713 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m96j8" event={"ID":"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40","Type":"ContainerDied","Data":"7a18a3e9ad73c9dedb3fcbe2dae3a06a1766a1c7a3e3e74f8e52e63336ce4c6a"} Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.434724 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a18a3e9ad73c9dedb3fcbe2dae3a06a1766a1c7a3e3e74f8e52e63336ce4c6a" Jan 31 14:57:04 
crc kubenswrapper[4751]: I0131 14:57:04.439382 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.524956 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-utilities\") pod \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.525052 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-catalog-content\") pod \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.525095 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lgqw\" (UniqueName: \"kubernetes.io/projected/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-kube-api-access-7lgqw\") pod \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\" (UID: \"cbf9b8e1-7d1f-4fe4-9111-c040d8842d40\") " Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.527315 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-utilities" (OuterVolumeSpecName: "utilities") pod "cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" (UID: "cbf9b8e1-7d1f-4fe4-9111-c040d8842d40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.544850 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-kube-api-access-7lgqw" (OuterVolumeSpecName: "kube-api-access-7lgqw") pod "cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" (UID: "cbf9b8e1-7d1f-4fe4-9111-c040d8842d40"). InnerVolumeSpecName "kube-api-access-7lgqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.590760 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" (UID: "cbf9b8e1-7d1f-4fe4-9111-c040d8842d40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.626453 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.626489 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lgqw\" (UniqueName: \"kubernetes.io/projected/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-kube-api-access-7lgqw\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:04 crc kubenswrapper[4751]: I0131 14:57:04.626503 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:05 crc kubenswrapper[4751]: I0131 14:57:05.439901 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m96j8" Jan 31 14:57:05 crc kubenswrapper[4751]: I0131 14:57:05.497159 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m96j8"] Jan 31 14:57:05 crc kubenswrapper[4751]: I0131 14:57:05.505174 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m96j8"] Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.309881 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jqv5m"] Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.310126 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jqv5m" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="registry-server" containerID="cri-o://d6f182b4c85d9e139e51b5b38efb79b57d02af9a0357ca70a93a9631b5745566" gracePeriod=2 Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.411963 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" path="/var/lib/kubelet/pods/cbf9b8e1-7d1f-4fe4-9111-c040d8842d40/volumes" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.446686 4751 generic.go:334] "Generic (PLEG): container finished" podID="10399bf7-0161-488c-8001-e6ba927889e5" containerID="d6f182b4c85d9e139e51b5b38efb79b57d02af9a0357ca70a93a9631b5745566" exitCode=0 Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.446767 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerDied","Data":"d6f182b4c85d9e139e51b5b38efb79b57d02af9a0357ca70a93a9631b5745566"} Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.447923 4751 generic.go:334] "Generic (PLEG): container finished" podID="19317a08-b18b-42c9-bdc9-394e1e06257d" 
containerID="505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586" exitCode=0 Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.447961 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"19317a08-b18b-42c9-bdc9-394e1e06257d","Type":"ContainerDied","Data":"505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586"} Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.713104 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.851859 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqkft\" (UniqueName: \"kubernetes.io/projected/10399bf7-0161-488c-8001-e6ba927889e5-kube-api-access-cqkft\") pod \"10399bf7-0161-488c-8001-e6ba927889e5\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.851902 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-catalog-content\") pod \"10399bf7-0161-488c-8001-e6ba927889e5\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.851938 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-utilities\") pod \"10399bf7-0161-488c-8001-e6ba927889e5\" (UID: \"10399bf7-0161-488c-8001-e6ba927889e5\") " Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.852689 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-utilities" (OuterVolumeSpecName: "utilities") pod "10399bf7-0161-488c-8001-e6ba927889e5" (UID: 
"10399bf7-0161-488c-8001-e6ba927889e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.878358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10399bf7-0161-488c-8001-e6ba927889e5-kube-api-access-cqkft" (OuterVolumeSpecName: "kube-api-access-cqkft") pod "10399bf7-0161-488c-8001-e6ba927889e5" (UID: "10399bf7-0161-488c-8001-e6ba927889e5"). InnerVolumeSpecName "kube-api-access-cqkft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.910181 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10399bf7-0161-488c-8001-e6ba927889e5" (UID: "10399bf7-0161-488c-8001-e6ba927889e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.953614 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqkft\" (UniqueName: \"kubernetes.io/projected/10399bf7-0161-488c-8001-e6ba927889e5-kube-api-access-cqkft\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.953641 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:06 crc kubenswrapper[4751]: I0131 14:57:06.953650 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10399bf7-0161-488c-8001-e6ba927889e5-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.459573 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"19317a08-b18b-42c9-bdc9-394e1e06257d","Type":"ContainerStarted","Data":"07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9"} Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.459802 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.461683 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jqv5m" event={"ID":"10399bf7-0161-488c-8001-e6ba927889e5","Type":"ContainerDied","Data":"1db0e98d5840943d47fedc71d5069e41ebcf9dcd1cc035ff41c419ba526b7bb7"} Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.461743 4751 scope.go:117] "RemoveContainer" containerID="d6f182b4c85d9e139e51b5b38efb79b57d02af9a0357ca70a93a9631b5745566" Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.461773 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jqv5m" Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.484755 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.722083716 podStartE2EDuration="45.484731496s" podCreationTimestamp="2026-01-31 14:56:22 +0000 UTC" firstStartedPulling="2026-01-31 14:56:24.659287145 +0000 UTC m=+889.034000040" lastFinishedPulling="2026-01-31 14:56:33.421934935 +0000 UTC m=+897.796647820" observedRunningTime="2026-01-31 14:57:07.477700771 +0000 UTC m=+931.852413666" watchObservedRunningTime="2026-01-31 14:57:07.484731496 +0000 UTC m=+931.859444401" Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.496600 4751 scope.go:117] "RemoveContainer" containerID="9f513ef2d1f9802f83bbe5839191b499dfe0fb39587e4e549696c1f2a0b3b5c7" Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.510996 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-jqv5m"] Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.516866 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jqv5m"] Jan 31 14:57:07 crc kubenswrapper[4751]: I0131 14:57:07.519104 4751 scope.go:117] "RemoveContainer" containerID="e8bb73c8fea313592a221df9e5dbff86d9e1c2c8a380afe92fecdcb99f557785" Jan 31 14:57:08 crc kubenswrapper[4751]: I0131 14:57:08.413980 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10399bf7-0161-488c-8001-e6ba927889e5" path="/var/lib/kubelet/pods/10399bf7-0161-488c-8001-e6ba927889e5/volumes" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.934119 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-vjs56"] Jan 31 14:57:14 crc kubenswrapper[4751]: E0131 14:57:14.935257 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="registry-server" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935290 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="registry-server" Jan 31 14:57:14 crc kubenswrapper[4751]: E0131 14:57:14.935324 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="extract-content" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935341 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="extract-content" Jan 31 14:57:14 crc kubenswrapper[4751]: E0131 14:57:14.935387 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="registry-server" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935405 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" 
containerName="registry-server" Jan 31 14:57:14 crc kubenswrapper[4751]: E0131 14:57:14.935431 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="extract-utilities" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935449 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="extract-utilities" Jan 31 14:57:14 crc kubenswrapper[4751]: E0131 14:57:14.935467 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="extract-content" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935483 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="extract-content" Jan 31 14:57:14 crc kubenswrapper[4751]: E0131 14:57:14.935512 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="extract-utilities" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935528 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="extract-utilities" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935755 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="10399bf7-0161-488c-8001-e6ba927889e5" containerName="registry-server" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.935783 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf9b8e1-7d1f-4fe4-9111-c040d8842d40" containerName="registry-server" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.936589 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.941298 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-4qrpl" Jan 31 14:57:14 crc kubenswrapper[4751]: I0131 14:57:14.950640 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-vjs56"] Jan 31 14:57:15 crc kubenswrapper[4751]: I0131 14:57:15.071820 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqp67\" (UniqueName: \"kubernetes.io/projected/95bedc09-cab6-4e6b-a210-8cb1f8b39601-kube-api-access-vqp67\") pod \"horizon-operator-index-vjs56\" (UID: \"95bedc09-cab6-4e6b-a210-8cb1f8b39601\") " pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:15 crc kubenswrapper[4751]: I0131 14:57:15.173005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqp67\" (UniqueName: \"kubernetes.io/projected/95bedc09-cab6-4e6b-a210-8cb1f8b39601-kube-api-access-vqp67\") pod \"horizon-operator-index-vjs56\" (UID: \"95bedc09-cab6-4e6b-a210-8cb1f8b39601\") " pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:15 crc kubenswrapper[4751]: I0131 14:57:15.207301 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqp67\" (UniqueName: \"kubernetes.io/projected/95bedc09-cab6-4e6b-a210-8cb1f8b39601-kube-api-access-vqp67\") pod \"horizon-operator-index-vjs56\" (UID: \"95bedc09-cab6-4e6b-a210-8cb1f8b39601\") " pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:15 crc kubenswrapper[4751]: I0131 14:57:15.270259 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:15 crc kubenswrapper[4751]: I0131 14:57:15.779509 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-vjs56"] Jan 31 14:57:15 crc kubenswrapper[4751]: W0131 14:57:15.792742 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bedc09_cab6_4e6b_a210_8cb1f8b39601.slice/crio-480ede065aba964366fb7112751f9b1d88da7e06ab1cbbbb4f9f2d5f7ef6e632 WatchSource:0}: Error finding container 480ede065aba964366fb7112751f9b1d88da7e06ab1cbbbb4f9f2d5f7ef6e632: Status 404 returned error can't find the container with id 480ede065aba964366fb7112751f9b1d88da7e06ab1cbbbb4f9f2d5f7ef6e632 Jan 31 14:57:16 crc kubenswrapper[4751]: I0131 14:57:16.538984 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-vjs56" event={"ID":"95bedc09-cab6-4e6b-a210-8cb1f8b39601","Type":"ContainerStarted","Data":"480ede065aba964366fb7112751f9b1d88da7e06ab1cbbbb4f9f2d5f7ef6e632"} Jan 31 14:57:17 crc kubenswrapper[4751]: I0131 14:57:17.549224 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-vjs56" event={"ID":"95bedc09-cab6-4e6b-a210-8cb1f8b39601","Type":"ContainerStarted","Data":"4a2e12dfd21dff2f78267376159e9e47a8fbc75b43ecb1ce27474c3a534a3f4b"} Jan 31 14:57:17 crc kubenswrapper[4751]: I0131 14:57:17.577701 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-vjs56" podStartSLOduration=2.486879434 podStartE2EDuration="3.577676138s" podCreationTimestamp="2026-01-31 14:57:14 +0000 UTC" firstStartedPulling="2026-01-31 14:57:15.796728132 +0000 UTC m=+940.171441057" lastFinishedPulling="2026-01-31 14:57:16.887524876 +0000 UTC m=+941.262237761" observedRunningTime="2026-01-31 14:57:17.569540454 +0000 UTC m=+941.944253399" 
watchObservedRunningTime="2026-01-31 14:57:17.577676138 +0000 UTC m=+941.952389063" Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.736168 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-75pvx"] Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.738433 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.741645 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-npvsh" Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.747762 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-75pvx"] Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.846134 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpg2k\" (UniqueName: \"kubernetes.io/projected/065b8624-7cdb-463c-9636-d3e980119eb7-kube-api-access-qpg2k\") pod \"swift-operator-index-75pvx\" (UID: \"065b8624-7cdb-463c-9636-d3e980119eb7\") " pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.947366 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpg2k\" (UniqueName: \"kubernetes.io/projected/065b8624-7cdb-463c-9636-d3e980119eb7-kube-api-access-qpg2k\") pod \"swift-operator-index-75pvx\" (UID: \"065b8624-7cdb-463c-9636-d3e980119eb7\") " pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:19 crc kubenswrapper[4751]: I0131 14:57:19.973049 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpg2k\" (UniqueName: \"kubernetes.io/projected/065b8624-7cdb-463c-9636-d3e980119eb7-kube-api-access-qpg2k\") pod \"swift-operator-index-75pvx\" (UID: \"065b8624-7cdb-463c-9636-d3e980119eb7\") " 
pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:20 crc kubenswrapper[4751]: I0131 14:57:20.108936 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:20 crc kubenswrapper[4751]: I0131 14:57:20.588027 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-75pvx"] Jan 31 14:57:20 crc kubenswrapper[4751]: W0131 14:57:20.595846 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065b8624_7cdb_463c_9636_d3e980119eb7.slice/crio-5f493f4e9467a7936b5f9e1ffc78338268f76a1484833cfafa1962d6944fc1c3 WatchSource:0}: Error finding container 5f493f4e9467a7936b5f9e1ffc78338268f76a1484833cfafa1962d6944fc1c3: Status 404 returned error can't find the container with id 5f493f4e9467a7936b5f9e1ffc78338268f76a1484833cfafa1962d6944fc1c3 Jan 31 14:57:21 crc kubenswrapper[4751]: I0131 14:57:21.579986 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-75pvx" event={"ID":"065b8624-7cdb-463c-9636-d3e980119eb7","Type":"ContainerStarted","Data":"5f493f4e9467a7936b5f9e1ffc78338268f76a1484833cfafa1962d6944fc1c3"} Jan 31 14:57:22 crc kubenswrapper[4751]: I0131 14:57:22.591971 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-75pvx" event={"ID":"065b8624-7cdb-463c-9636-d3e980119eb7","Type":"ContainerStarted","Data":"d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3"} Jan 31 14:57:22 crc kubenswrapper[4751]: I0131 14:57:22.613548 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-75pvx" podStartSLOduration=2.71122359 podStartE2EDuration="3.613529075s" podCreationTimestamp="2026-01-31 14:57:19 +0000 UTC" firstStartedPulling="2026-01-31 14:57:20.598819818 +0000 UTC m=+944.973532703" 
lastFinishedPulling="2026-01-31 14:57:21.501125303 +0000 UTC m=+945.875838188" observedRunningTime="2026-01-31 14:57:22.607295811 +0000 UTC m=+946.982008696" watchObservedRunningTime="2026-01-31 14:57:22.613529075 +0000 UTC m=+946.988241960" Jan 31 14:57:24 crc kubenswrapper[4751]: I0131 14:57:24.331367 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 14:57:25 crc kubenswrapper[4751]: I0131 14:57:25.271031 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:25 crc kubenswrapper[4751]: I0131 14:57:25.271341 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:25 crc kubenswrapper[4751]: I0131 14:57:25.312692 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:25 crc kubenswrapper[4751]: I0131 14:57:25.633203 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-vjs56" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.659951 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-pl5bs"] Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.661233 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.665851 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd"] Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.667040 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.671408 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.676850 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-pl5bs"] Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.680179 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd"] Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.770847 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a47516-5cf6-431b-86ee-7732bd88fed4-operator-scripts\") pod \"keystone-dfde-account-create-update-vbcnd\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.771462 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-operator-scripts\") pod \"keystone-db-create-pl5bs\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.772131 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbsp6\" (UniqueName: \"kubernetes.io/projected/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-kube-api-access-mbsp6\") pod \"keystone-db-create-pl5bs\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.772214 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmt5z\" (UniqueName: \"kubernetes.io/projected/06a47516-5cf6-431b-86ee-7732bd88fed4-kube-api-access-xmt5z\") pod \"keystone-dfde-account-create-update-vbcnd\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.873149 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbsp6\" (UniqueName: \"kubernetes.io/projected/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-kube-api-access-mbsp6\") pod \"keystone-db-create-pl5bs\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.873202 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmt5z\" (UniqueName: \"kubernetes.io/projected/06a47516-5cf6-431b-86ee-7732bd88fed4-kube-api-access-xmt5z\") pod \"keystone-dfde-account-create-update-vbcnd\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.873246 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a47516-5cf6-431b-86ee-7732bd88fed4-operator-scripts\") pod \"keystone-dfde-account-create-update-vbcnd\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.873318 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-operator-scripts\") pod \"keystone-db-create-pl5bs\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " 
pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.874259 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a47516-5cf6-431b-86ee-7732bd88fed4-operator-scripts\") pod \"keystone-dfde-account-create-update-vbcnd\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.874272 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-operator-scripts\") pod \"keystone-db-create-pl5bs\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.892016 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmt5z\" (UniqueName: \"kubernetes.io/projected/06a47516-5cf6-431b-86ee-7732bd88fed4-kube-api-access-xmt5z\") pod \"keystone-dfde-account-create-update-vbcnd\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:28 crc kubenswrapper[4751]: I0131 14:57:28.892060 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbsp6\" (UniqueName: \"kubernetes.io/projected/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-kube-api-access-mbsp6\") pod \"keystone-db-create-pl5bs\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:29 crc kubenswrapper[4751]: I0131 14:57:29.046324 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:29 crc kubenswrapper[4751]: I0131 14:57:29.047696 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:29 crc kubenswrapper[4751]: I0131 14:57:29.520235 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd"] Jan 31 14:57:29 crc kubenswrapper[4751]: I0131 14:57:29.570848 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-pl5bs"] Jan 31 14:57:29 crc kubenswrapper[4751]: W0131 14:57:29.603189 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod568d26c9_1fe8_4e01_a7c0_cbe91951fe60.slice/crio-bd00717c7915ab2ac4085f5f187393778b80d29e2118670743465637f5e79db6 WatchSource:0}: Error finding container bd00717c7915ab2ac4085f5f187393778b80d29e2118670743465637f5e79db6: Status 404 returned error can't find the container with id bd00717c7915ab2ac4085f5f187393778b80d29e2118670743465637f5e79db6 Jan 31 14:57:29 crc kubenswrapper[4751]: I0131 14:57:29.644800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-pl5bs" event={"ID":"568d26c9-1fe8-4e01-a7c0-cbe91951fe60","Type":"ContainerStarted","Data":"bd00717c7915ab2ac4085f5f187393778b80d29e2118670743465637f5e79db6"} Jan 31 14:57:29 crc kubenswrapper[4751]: I0131 14:57:29.646601 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" event={"ID":"06a47516-5cf6-431b-86ee-7732bd88fed4","Type":"ContainerStarted","Data":"4717c7f2329c1c3fcafc8e5559236099ccd5c56439f278f17fdefcf2a479b42c"} Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.109617 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.109966 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.149028 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.655899 4751 generic.go:334] "Generic (PLEG): container finished" podID="568d26c9-1fe8-4e01-a7c0-cbe91951fe60" containerID="457ea80a5f749ea606e6892b07ad8e22c7b832800f0f223bc54849035a17270d" exitCode=0 Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.655951 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-pl5bs" event={"ID":"568d26c9-1fe8-4e01-a7c0-cbe91951fe60","Type":"ContainerDied","Data":"457ea80a5f749ea606e6892b07ad8e22c7b832800f0f223bc54849035a17270d"} Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.657896 4751 generic.go:334] "Generic (PLEG): container finished" podID="06a47516-5cf6-431b-86ee-7732bd88fed4" containerID="eec75dcec16927bdd78c685c8995e59bfecf459a9739faabb410481b5046b1fb" exitCode=0 Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.657935 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" event={"ID":"06a47516-5cf6-431b-86ee-7732bd88fed4","Type":"ContainerDied","Data":"eec75dcec16927bdd78c685c8995e59bfecf459a9739faabb410481b5046b1fb"} Jan 31 14:57:30 crc kubenswrapper[4751]: I0131 14:57:30.690336 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.053914 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.062294 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-pl5bs" Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.122322 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a47516-5cf6-431b-86ee-7732bd88fed4-operator-scripts\") pod \"06a47516-5cf6-431b-86ee-7732bd88fed4\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.122369 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbsp6\" (UniqueName: \"kubernetes.io/projected/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-kube-api-access-mbsp6\") pod \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.122407 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-operator-scripts\") pod \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\" (UID: \"568d26c9-1fe8-4e01-a7c0-cbe91951fe60\") " Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.122446 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmt5z\" (UniqueName: \"kubernetes.io/projected/06a47516-5cf6-431b-86ee-7732bd88fed4-kube-api-access-xmt5z\") pod \"06a47516-5cf6-431b-86ee-7732bd88fed4\" (UID: \"06a47516-5cf6-431b-86ee-7732bd88fed4\") " Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.123581 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "568d26c9-1fe8-4e01-a7c0-cbe91951fe60" (UID: "568d26c9-1fe8-4e01-a7c0-cbe91951fe60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.124625 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a47516-5cf6-431b-86ee-7732bd88fed4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06a47516-5cf6-431b-86ee-7732bd88fed4" (UID: "06a47516-5cf6-431b-86ee-7732bd88fed4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.130453 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-kube-api-access-mbsp6" (OuterVolumeSpecName: "kube-api-access-mbsp6") pod "568d26c9-1fe8-4e01-a7c0-cbe91951fe60" (UID: "568d26c9-1fe8-4e01-a7c0-cbe91951fe60"). InnerVolumeSpecName "kube-api-access-mbsp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.132370 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a47516-5cf6-431b-86ee-7732bd88fed4-kube-api-access-xmt5z" (OuterVolumeSpecName: "kube-api-access-xmt5z") pod "06a47516-5cf6-431b-86ee-7732bd88fed4" (UID: "06a47516-5cf6-431b-86ee-7732bd88fed4"). InnerVolumeSpecName "kube-api-access-xmt5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.224126 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a47516-5cf6-431b-86ee-7732bd88fed4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.224205 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbsp6\" (UniqueName: \"kubernetes.io/projected/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-kube-api-access-mbsp6\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.224219 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/568d26c9-1fe8-4e01-a7c0-cbe91951fe60-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.224232 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmt5z\" (UniqueName: \"kubernetes.io/projected/06a47516-5cf6-431b-86ee-7732bd88fed4-kube-api-access-xmt5z\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.676559 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd" event={"ID":"06a47516-5cf6-431b-86ee-7732bd88fed4","Type":"ContainerDied","Data":"4717c7f2329c1c3fcafc8e5559236099ccd5c56439f278f17fdefcf2a479b42c"}
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.676632 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd"
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.676636 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4717c7f2329c1c3fcafc8e5559236099ccd5c56439f278f17fdefcf2a479b42c"
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.681160 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-pl5bs" event={"ID":"568d26c9-1fe8-4e01-a7c0-cbe91951fe60","Type":"ContainerDied","Data":"bd00717c7915ab2ac4085f5f187393778b80d29e2118670743465637f5e79db6"}
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.681443 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd00717c7915ab2ac4085f5f187393778b80d29e2118670743465637f5e79db6"
Jan 31 14:57:32 crc kubenswrapper[4751]: I0131 14:57:32.681213 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-pl5bs"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.237106 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-56bwv"]
Jan 31 14:57:34 crc kubenswrapper[4751]: E0131 14:57:34.238537 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568d26c9-1fe8-4e01-a7c0-cbe91951fe60" containerName="mariadb-database-create"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.238630 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="568d26c9-1fe8-4e01-a7c0-cbe91951fe60" containerName="mariadb-database-create"
Jan 31 14:57:34 crc kubenswrapper[4751]: E0131 14:57:34.238723 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a47516-5cf6-431b-86ee-7732bd88fed4" containerName="mariadb-account-create-update"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.238797 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a47516-5cf6-431b-86ee-7732bd88fed4"
containerName="mariadb-account-create-update"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.238996 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a47516-5cf6-431b-86ee-7732bd88fed4" containerName="mariadb-account-create-update"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.239149 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="568d26c9-1fe8-4e01-a7c0-cbe91951fe60" containerName="mariadb-database-create"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.239781 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.243051 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.243291 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.243521 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-szsvc"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.245650 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.252607 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-56bwv"]
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.355668 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hq52\" (UniqueName: \"kubernetes.io/projected/ff5e8bad-e481-445e-99e8-5a5487e908d8-kube-api-access-7hq52\") pod \"keystone-db-sync-56bwv\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.355935 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5e8bad-e481-445e-99e8-5a5487e908d8-config-data\") pod \"keystone-db-sync-56bwv\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.457863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hq52\" (UniqueName: \"kubernetes.io/projected/ff5e8bad-e481-445e-99e8-5a5487e908d8-kube-api-access-7hq52\") pod \"keystone-db-sync-56bwv\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.458014 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5e8bad-e481-445e-99e8-5a5487e908d8-config-data\") pod \"keystone-db-sync-56bwv\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.468279 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5e8bad-e481-445e-99e8-5a5487e908d8-config-data\") pod \"keystone-db-sync-56bwv\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.488000 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hq52\" (UniqueName: \"kubernetes.io/projected/ff5e8bad-e481-445e-99e8-5a5487e908d8-kube-api-access-7hq52\") pod \"keystone-db-sync-56bwv\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.555635 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-56bwv"
Jan 31 14:57:34 crc kubenswrapper[4751]: I0131 14:57:34.784563 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-56bwv"]
Jan 31 14:57:35 crc kubenswrapper[4751]: I0131 14:57:35.703219 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-56bwv" event={"ID":"ff5e8bad-e481-445e-99e8-5a5487e908d8","Type":"ContainerStarted","Data":"ecc8a96f4fea25c974208298436bf510666bd64b0748730b17c3c0f483b01a86"}
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.558210 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"]
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.560192 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.566183 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wxkjx"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.566809 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"]
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.692869 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnpjj\" (UniqueName: \"kubernetes.io/projected/886303a3-d05b-4551-bd03-ebc2e2aef77c-kube-api-access-rnpjj\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.692948 4751 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-bundle\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.692969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-util\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.793929 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnpjj\" (UniqueName: \"kubernetes.io/projected/886303a3-d05b-4551-bd03-ebc2e2aef77c-kube-api-access-rnpjj\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.793997 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-bundle\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.794020 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-util\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.794498 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-util\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.795036 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-bundle\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.816956 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnpjj\" (UniqueName: \"kubernetes.io/projected/886303a3-d05b-4551-bd03-ebc2e2aef77c-kube-api-access-rnpjj\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:36 crc kubenswrapper[4751]: I0131 14:57:36.892510 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.382762 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"]
Jan 31 14:57:37 crc kubenswrapper[4751]: W0131 14:57:37.396191 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886303a3_d05b_4551_bd03_ebc2e2aef77c.slice/crio-afefcd8155138e543666c28dd288356aaa1ab98ddb01c246f7d57e4269dc77ee WatchSource:0}: Error finding container afefcd8155138e543666c28dd288356aaa1ab98ddb01c246f7d57e4269dc77ee: Status 404 returned error can't find the container with id afefcd8155138e543666c28dd288356aaa1ab98ddb01c246f7d57e4269dc77ee
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.545550 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"]
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.546930 4751 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.559342 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"]
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.605049 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-util\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.605252 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-bundle\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.605308 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs89j\" (UniqueName: \"kubernetes.io/projected/eec59a88-8f4d-4482-aa2a-11a508cc3a79-kube-api-access-gs89j\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.706296 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-bundle\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.706347 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs89j\" (UniqueName: \"kubernetes.io/projected/eec59a88-8f4d-4482-aa2a-11a508cc3a79-kube-api-access-gs89j\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.706389 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-util\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.706761 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-util\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.706841 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-bundle\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.719221 4751 generic.go:334] "Generic (PLEG): container finished" podID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerID="ac644719d568c7b156ce9cbb766a2f8c70e69f2f94ca1bad0488a7736c5cd6c9" exitCode=0
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.719266 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq" event={"ID":"886303a3-d05b-4551-bd03-ebc2e2aef77c","Type":"ContainerDied","Data":"ac644719d568c7b156ce9cbb766a2f8c70e69f2f94ca1bad0488a7736c5cd6c9"}
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.719291 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq" event={"ID":"886303a3-d05b-4551-bd03-ebc2e2aef77c","Type":"ContainerStarted","Data":"afefcd8155138e543666c28dd288356aaa1ab98ddb01c246f7d57e4269dc77ee"}
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.727005 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs89j\" (UniqueName: \"kubernetes.io/projected/eec59a88-8f4d-4482-aa2a-11a508cc3a79-kube-api-access-gs89j\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:37 crc kubenswrapper[4751]: I0131 14:57:37.881858 4751 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:45 crc kubenswrapper[4751]: I0131 14:57:45.782435 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-56bwv" event={"ID":"ff5e8bad-e481-445e-99e8-5a5487e908d8","Type":"ContainerStarted","Data":"5d457b880e70ab7d7bdcd88eb562c916f03e6b62d577ebf9192cc4974cd177f7"}
Jan 31 14:57:45 crc kubenswrapper[4751]: I0131 14:57:45.786353 4751 generic.go:334] "Generic (PLEG): container finished" podID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerID="1985ee06fa1b0e5b47503229ec369a787fff12bff875d4cad0ea6a84e35d2169" exitCode=0
Jan 31 14:57:45 crc kubenswrapper[4751]: I0131 14:57:45.786401 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq" event={"ID":"886303a3-d05b-4551-bd03-ebc2e2aef77c","Type":"ContainerDied","Data":"1985ee06fa1b0e5b47503229ec369a787fff12bff875d4cad0ea6a84e35d2169"}
Jan 31 14:57:45 crc kubenswrapper[4751]: I0131 14:57:45.803185 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-56bwv" podStartSLOduration=1.105160552 podStartE2EDuration="11.803169889s" podCreationTimestamp="2026-01-31 14:57:34 +0000 UTC" firstStartedPulling="2026-01-31 14:57:34.796032357 +0000 UTC m=+959.170745252" lastFinishedPulling="2026-01-31 14:57:45.494041684 +0000 UTC m=+969.868754589" observedRunningTime="2026-01-31 14:57:45.801412813 +0000 UTC m=+970.176125698" watchObservedRunningTime="2026-01-31 14:57:45.803169889 +0000 UTC m=+970.177882774"
Jan 31 14:57:45 crc kubenswrapper[4751]: W0131 14:57:45.896280 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec59a88_8f4d_4482_aa2a_11a508cc3a79.slice/crio-a5c216ff73ca50f7f26c6f01ee85cb8e63a571945ededa582f156f42cc594b11 WatchSource:0}: Error finding container a5c216ff73ca50f7f26c6f01ee85cb8e63a571945ededa582f156f42cc594b11: Status 404 returned error can't find the container with id a5c216ff73ca50f7f26c6f01ee85cb8e63a571945ededa582f156f42cc594b11
Jan 31 14:57:45 crc kubenswrapper[4751]: I0131 14:57:45.897500 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"]
Jan 31 14:57:46 crc kubenswrapper[4751]: I0131 14:57:46.807938 4751 generic.go:334] "Generic (PLEG): container finished" podID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerID="f300eeacb21ac9e2fbb188d6628873c007703196505315fe182c22ec9d5b15ea" exitCode=0
Jan 31 14:57:46 crc kubenswrapper[4751]: I0131 14:57:46.808005 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f" event={"ID":"eec59a88-8f4d-4482-aa2a-11a508cc3a79","Type":"ContainerDied","Data":"f300eeacb21ac9e2fbb188d6628873c007703196505315fe182c22ec9d5b15ea"}
Jan 31 14:57:46 crc kubenswrapper[4751]: I0131 14:57:46.808579 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f" event={"ID":"eec59a88-8f4d-4482-aa2a-11a508cc3a79","Type":"ContainerStarted","Data":"a5c216ff73ca50f7f26c6f01ee85cb8e63a571945ededa582f156f42cc594b11"}
Jan 31 14:57:46 crc kubenswrapper[4751]: I0131 14:57:46.811560 4751 generic.go:334] "Generic (PLEG): container finished" podID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerID="48586bec329cecb88f31df9f626d414b524092e8f0898f91d2fb0a6740d113ca" exitCode=0
Jan 31 14:57:46 crc kubenswrapper[4751]: I0131 14:57:46.811922 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq" event={"ID":"886303a3-d05b-4551-bd03-ebc2e2aef77c","Type":"ContainerDied","Data":"48586bec329cecb88f31df9f626d414b524092e8f0898f91d2fb0a6740d113ca"}
Jan 31 14:57:47 crc kubenswrapper[4751]: I0131 14:57:47.819873 4751 generic.go:334] "Generic (PLEG): container finished" podID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerID="e4090a5a7ccdd198b52342527d7dfe4217aa94455211ced22fbdbfbfcf820855" exitCode=0
Jan 31 14:57:47 crc kubenswrapper[4751]: I0131 14:57:47.819964 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f" event={"ID":"eec59a88-8f4d-4482-aa2a-11a508cc3a79","Type":"ContainerDied","Data":"e4090a5a7ccdd198b52342527d7dfe4217aa94455211ced22fbdbfbfcf820855"}
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.138252 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.165554 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-bundle\") pod \"886303a3-d05b-4551-bd03-ebc2e2aef77c\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") "
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.165615 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-util\") pod \"886303a3-d05b-4551-bd03-ebc2e2aef77c\" (UID: \"886303a3-d05b-4551-bd03-ebc2e2aef77c\") "
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.165684 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnpjj\" (UniqueName: \"kubernetes.io/projected/886303a3-d05b-4551-bd03-ebc2e2aef77c-kube-api-access-rnpjj\") pod \"886303a3-d05b-4551-bd03-ebc2e2aef77c\" (UID:
\"886303a3-d05b-4551-bd03-ebc2e2aef77c\") "
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.167080 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-bundle" (OuterVolumeSpecName: "bundle") pod "886303a3-d05b-4551-bd03-ebc2e2aef77c" (UID: "886303a3-d05b-4551-bd03-ebc2e2aef77c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.171622 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886303a3-d05b-4551-bd03-ebc2e2aef77c-kube-api-access-rnpjj" (OuterVolumeSpecName: "kube-api-access-rnpjj") pod "886303a3-d05b-4551-bd03-ebc2e2aef77c" (UID: "886303a3-d05b-4551-bd03-ebc2e2aef77c"). InnerVolumeSpecName "kube-api-access-rnpjj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.175984 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-util" (OuterVolumeSpecName: "util") pod "886303a3-d05b-4551-bd03-ebc2e2aef77c" (UID: "886303a3-d05b-4551-bd03-ebc2e2aef77c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.267862 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.267906 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/886303a3-d05b-4551-bd03-ebc2e2aef77c-util\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.267918 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnpjj\" (UniqueName: \"kubernetes.io/projected/886303a3-d05b-4551-bd03-ebc2e2aef77c-kube-api-access-rnpjj\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.828462 4751 generic.go:334] "Generic (PLEG): container finished" podID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerID="b51b06f6e48ff45871305e4af164ad58191ccac9b7a5f2b2bea9c0fdbdc14454" exitCode=0
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.828585 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f" event={"ID":"eec59a88-8f4d-4482-aa2a-11a508cc3a79","Type":"ContainerDied","Data":"b51b06f6e48ff45871305e4af164ad58191ccac9b7a5f2b2bea9c0fdbdc14454"}
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.831377 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq" event={"ID":"886303a3-d05b-4551-bd03-ebc2e2aef77c","Type":"ContainerDied","Data":"afefcd8155138e543666c28dd288356aaa1ab98ddb01c246f7d57e4269dc77ee"}
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.831409 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afefcd8155138e543666c28dd288356aaa1ab98ddb01c246f7d57e4269dc77ee"
Jan 31 14:57:48 crc kubenswrapper[4751]: I0131 14:57:48.831416 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"
Jan 31 14:57:49 crc kubenswrapper[4751]: I0131 14:57:49.838362 4751 generic.go:334] "Generic (PLEG): container finished" podID="ff5e8bad-e481-445e-99e8-5a5487e908d8" containerID="5d457b880e70ab7d7bdcd88eb562c916f03e6b62d577ebf9192cc4974cd177f7" exitCode=0
Jan 31 14:57:49 crc kubenswrapper[4751]: I0131 14:57:49.838446 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-56bwv" event={"ID":"ff5e8bad-e481-445e-99e8-5a5487e908d8","Type":"ContainerDied","Data":"5d457b880e70ab7d7bdcd88eb562c916f03e6b62d577ebf9192cc4974cd177f7"}
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.142172 4751 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.192318 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs89j\" (UniqueName: \"kubernetes.io/projected/eec59a88-8f4d-4482-aa2a-11a508cc3a79-kube-api-access-gs89j\") pod \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") "
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.192471 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-bundle\") pod \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") "
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.193185 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-bundle" (OuterVolumeSpecName: "bundle") pod "eec59a88-8f4d-4482-aa2a-11a508cc3a79" (UID: "eec59a88-8f4d-4482-aa2a-11a508cc3a79"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.193814 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-util\") pod \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\" (UID: \"eec59a88-8f4d-4482-aa2a-11a508cc3a79\") "
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.194438 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.214896 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec59a88-8f4d-4482-aa2a-11a508cc3a79-kube-api-access-gs89j" (OuterVolumeSpecName: "kube-api-access-gs89j") pod "eec59a88-8f4d-4482-aa2a-11a508cc3a79" (UID: "eec59a88-8f4d-4482-aa2a-11a508cc3a79"). InnerVolumeSpecName "kube-api-access-gs89j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.220549 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-util" (OuterVolumeSpecName: "util") pod "eec59a88-8f4d-4482-aa2a-11a508cc3a79" (UID: "eec59a88-8f4d-4482-aa2a-11a508cc3a79"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.295211 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eec59a88-8f4d-4482-aa2a-11a508cc3a79-util\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.295239 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs89j\" (UniqueName: \"kubernetes.io/projected/eec59a88-8f4d-4482-aa2a-11a508cc3a79-kube-api-access-gs89j\") on node \"crc\" DevicePath \"\""
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.847773 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f" event={"ID":"eec59a88-8f4d-4482-aa2a-11a508cc3a79","Type":"ContainerDied","Data":"a5c216ff73ca50f7f26c6f01ee85cb8e63a571945ededa582f156f42cc594b11"}
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.848182 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c216ff73ca50f7f26c6f01ee85cb8e63a571945ededa582f156f42cc594b11"
Jan 31 14:57:50 crc kubenswrapper[4751]: I0131 14:57:50.847790 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f"
Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.169889 4751 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-56bwv" Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.205932 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5e8bad-e481-445e-99e8-5a5487e908d8-config-data\") pod \"ff5e8bad-e481-445e-99e8-5a5487e908d8\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.206108 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hq52\" (UniqueName: \"kubernetes.io/projected/ff5e8bad-e481-445e-99e8-5a5487e908d8-kube-api-access-7hq52\") pod \"ff5e8bad-e481-445e-99e8-5a5487e908d8\" (UID: \"ff5e8bad-e481-445e-99e8-5a5487e908d8\") " Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.211259 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5e8bad-e481-445e-99e8-5a5487e908d8-kube-api-access-7hq52" (OuterVolumeSpecName: "kube-api-access-7hq52") pod "ff5e8bad-e481-445e-99e8-5a5487e908d8" (UID: "ff5e8bad-e481-445e-99e8-5a5487e908d8"). InnerVolumeSpecName "kube-api-access-7hq52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.234489 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5e8bad-e481-445e-99e8-5a5487e908d8-config-data" (OuterVolumeSpecName: "config-data") pod "ff5e8bad-e481-445e-99e8-5a5487e908d8" (UID: "ff5e8bad-e481-445e-99e8-5a5487e908d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.307956 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5e8bad-e481-445e-99e8-5a5487e908d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.307991 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hq52\" (UniqueName: \"kubernetes.io/projected/ff5e8bad-e481-445e-99e8-5a5487e908d8-kube-api-access-7hq52\") on node \"crc\" DevicePath \"\"" Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.855397 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-56bwv" event={"ID":"ff5e8bad-e481-445e-99e8-5a5487e908d8","Type":"ContainerDied","Data":"ecc8a96f4fea25c974208298436bf510666bd64b0748730b17c3c0f483b01a86"} Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.855451 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc8a96f4fea25c974208298436bf510666bd64b0748730b17c3c0f483b01a86" Jan 31 14:57:51 crc kubenswrapper[4751]: I0131 14:57:51.855507 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-56bwv" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.068839 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-hnxnd"] Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069159 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="util" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069176 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="util" Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069193 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="pull" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069202 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="pull" Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069217 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="extract" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069224 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="extract" Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069237 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5e8bad-e481-445e-99e8-5a5487e908d8" containerName="keystone-db-sync" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069245 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5e8bad-e481-445e-99e8-5a5487e908d8" containerName="keystone-db-sync" Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069257 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="util" Jan 31 14:57:52 crc 
kubenswrapper[4751]: I0131 14:57:52.069265 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="util" Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069278 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="pull" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069286 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="pull" Jan 31 14:57:52 crc kubenswrapper[4751]: E0131 14:57:52.069296 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="extract" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069304 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="extract" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069431 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5e8bad-e481-445e-99e8-5a5487e908d8" containerName="keystone-db-sync" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069449 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" containerName="extract" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069465 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec59a88-8f4d-4482-aa2a-11a508cc3a79" containerName="extract" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.069985 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.072257 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.072455 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.072551 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.072666 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.073063 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-szsvc" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.090483 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-hnxnd"] Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.127849 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-credential-keys\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.127913 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-fernet-keys\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.127934 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lxls\" (UniqueName: \"kubernetes.io/projected/041ede36-25a1-4d6d-9de2-d16218c5fc67-kube-api-access-9lxls\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.128250 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-config-data\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.128352 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-scripts\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.229899 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-scripts\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.230007 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-credential-keys\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.230051 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-fernet-keys\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.230091 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lxls\" (UniqueName: \"kubernetes.io/projected/041ede36-25a1-4d6d-9de2-d16218c5fc67-kube-api-access-9lxls\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.230141 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-config-data\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.248175 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-fernet-keys\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.248644 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-scripts\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.248901 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-config-data\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.251542 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-credential-keys\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.254710 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lxls\" (UniqueName: \"kubernetes.io/projected/041ede36-25a1-4d6d-9de2-d16218c5fc67-kube-api-access-9lxls\") pod \"keystone-bootstrap-hnxnd\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.397868 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.819480 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-hnxnd"] Jan 31 14:57:52 crc kubenswrapper[4751]: W0131 14:57:52.829399 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod041ede36_25a1_4d6d_9de2_d16218c5fc67.slice/crio-abf3888aa3724ab2f3cae0622ef9b13caeec62afa882cdfaa1640df9c86f1488 WatchSource:0}: Error finding container abf3888aa3724ab2f3cae0622ef9b13caeec62afa882cdfaa1640df9c86f1488: Status 404 returned error can't find the container with id abf3888aa3724ab2f3cae0622ef9b13caeec62afa882cdfaa1640df9c86f1488 Jan 31 14:57:52 crc kubenswrapper[4751]: I0131 14:57:52.863362 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" event={"ID":"041ede36-25a1-4d6d-9de2-d16218c5fc67","Type":"ContainerStarted","Data":"abf3888aa3724ab2f3cae0622ef9b13caeec62afa882cdfaa1640df9c86f1488"} Jan 31 14:57:55 crc kubenswrapper[4751]: I0131 14:57:55.888933 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" event={"ID":"041ede36-25a1-4d6d-9de2-d16218c5fc67","Type":"ContainerStarted","Data":"be0ffdbf0de55d407a928a375e5355c5f5a9cda93c0fc7ee45e3254cddeefdc8"} Jan 31 14:57:55 crc kubenswrapper[4751]: I0131 14:57:55.910896 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" podStartSLOduration=3.910875851 podStartE2EDuration="3.910875851s" podCreationTimestamp="2026-01-31 14:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:57:55.905983092 +0000 UTC m=+980.280695977" watchObservedRunningTime="2026-01-31 14:57:55.910875851 +0000 UTC m=+980.285588736" 
Jan 31 14:57:58 crc kubenswrapper[4751]: I0131 14:57:58.909276 4751 generic.go:334] "Generic (PLEG): container finished" podID="041ede36-25a1-4d6d-9de2-d16218c5fc67" containerID="be0ffdbf0de55d407a928a375e5355c5f5a9cda93c0fc7ee45e3254cddeefdc8" exitCode=0 Jan 31 14:57:58 crc kubenswrapper[4751]: I0131 14:57:58.909389 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" event={"ID":"041ede36-25a1-4d6d-9de2-d16218c5fc67","Type":"ContainerDied","Data":"be0ffdbf0de55d407a928a375e5355c5f5a9cda93c0fc7ee45e3254cddeefdc8"} Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.153332 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.335281 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lxls\" (UniqueName: \"kubernetes.io/projected/041ede36-25a1-4d6d-9de2-d16218c5fc67-kube-api-access-9lxls\") pod \"041ede36-25a1-4d6d-9de2-d16218c5fc67\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.335361 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-config-data\") pod \"041ede36-25a1-4d6d-9de2-d16218c5fc67\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.335409 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-credential-keys\") pod \"041ede36-25a1-4d6d-9de2-d16218c5fc67\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.335473 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-scripts\") pod \"041ede36-25a1-4d6d-9de2-d16218c5fc67\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.335526 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-fernet-keys\") pod \"041ede36-25a1-4d6d-9de2-d16218c5fc67\" (UID: \"041ede36-25a1-4d6d-9de2-d16218c5fc67\") " Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.340140 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-scripts" (OuterVolumeSpecName: "scripts") pod "041ede36-25a1-4d6d-9de2-d16218c5fc67" (UID: "041ede36-25a1-4d6d-9de2-d16218c5fc67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.340657 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/041ede36-25a1-4d6d-9de2-d16218c5fc67-kube-api-access-9lxls" (OuterVolumeSpecName: "kube-api-access-9lxls") pod "041ede36-25a1-4d6d-9de2-d16218c5fc67" (UID: "041ede36-25a1-4d6d-9de2-d16218c5fc67"). InnerVolumeSpecName "kube-api-access-9lxls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.341223 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "041ede36-25a1-4d6d-9de2-d16218c5fc67" (UID: "041ede36-25a1-4d6d-9de2-d16218c5fc67"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.347298 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "041ede36-25a1-4d6d-9de2-d16218c5fc67" (UID: "041ede36-25a1-4d6d-9de2-d16218c5fc67"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.360822 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-config-data" (OuterVolumeSpecName: "config-data") pod "041ede36-25a1-4d6d-9de2-d16218c5fc67" (UID: "041ede36-25a1-4d6d-9de2-d16218c5fc67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.437781 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lxls\" (UniqueName: \"kubernetes.io/projected/041ede36-25a1-4d6d-9de2-d16218c5fc67-kube-api-access-9lxls\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.437814 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.437824 4751 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.437832 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:00 crc 
kubenswrapper[4751]: I0131 14:58:00.437840 4751 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/041ede36-25a1-4d6d-9de2-d16218c5fc67-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.925225 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" event={"ID":"041ede36-25a1-4d6d-9de2-d16218c5fc67","Type":"ContainerDied","Data":"abf3888aa3724ab2f3cae0622ef9b13caeec62afa882cdfaa1640df9c86f1488"} Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.925560 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf3888aa3724ab2f3cae0622ef9b13caeec62afa882cdfaa1640df9c86f1488" Jan 31 14:58:00 crc kubenswrapper[4751]: I0131 14:58:00.925280 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-hnxnd" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.016802 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-859d455469-zqqzw"] Jan 31 14:58:01 crc kubenswrapper[4751]: E0131 14:58:01.017234 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="041ede36-25a1-4d6d-9de2-d16218c5fc67" containerName="keystone-bootstrap" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.017297 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="041ede36-25a1-4d6d-9de2-d16218c5fc67" containerName="keystone-bootstrap" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.017495 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="041ede36-25a1-4d6d-9de2-d16218c5fc67" containerName="keystone-bootstrap" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.017949 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.022466 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.023219 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-szsvc" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.023250 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.023858 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.039324 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-859d455469-zqqzw"] Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.045803 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-scripts\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.046024 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-credential-keys\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.046146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-fernet-keys\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.046225 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-config-data\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.046304 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qfq\" (UniqueName: \"kubernetes.io/projected/dabb55da-08db-4d2a-8b2d-ac7b2b657053-kube-api-access-46qfq\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.146821 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-scripts\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.146873 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-credential-keys\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.146901 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-fernet-keys\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.146917 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-config-data\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.146951 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qfq\" (UniqueName: \"kubernetes.io/projected/dabb55da-08db-4d2a-8b2d-ac7b2b657053-kube-api-access-46qfq\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.150670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-credential-keys\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.150789 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-config-data\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.151352 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-scripts\") pod 
\"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.151672 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-fernet-keys\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.164706 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qfq\" (UniqueName: \"kubernetes.io/projected/dabb55da-08db-4d2a-8b2d-ac7b2b657053-kube-api-access-46qfq\") pod \"keystone-859d455469-zqqzw\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.333206 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.779353 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-859d455469-zqqzw"] Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.932437 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" event={"ID":"dabb55da-08db-4d2a-8b2d-ac7b2b657053","Type":"ContainerStarted","Data":"6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb"} Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.932482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" event={"ID":"dabb55da-08db-4d2a-8b2d-ac7b2b657053","Type":"ContainerStarted","Data":"02e1eb0fcf9c093b28dd6fc9f0fb02613d1865a02336d6e8e82c2fa50f8597a7"} Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.932588 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:01 crc kubenswrapper[4751]: I0131 14:58:01.947938 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" podStartSLOduration=1.947917205 podStartE2EDuration="1.947917205s" podCreationTimestamp="2026-01-31 14:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:58:01.947306669 +0000 UTC m=+986.322019564" watchObservedRunningTime="2026-01-31 14:58:01.947917205 +0000 UTC m=+986.322630090" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.304904 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-59595cd-9djr5"] Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.306030 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.307863 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zkbxh" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.308173 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.322943 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-59595cd-9djr5"] Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.496539 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-apiservice-cert\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.496612 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-webhook-cert\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.496672 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67l2j\" (UniqueName: \"kubernetes.io/projected/9f3dfaad-d451-448b-a447-47fc7bbff0e5-kube-api-access-67l2j\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " 
pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.597502 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-apiservice-cert\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.597558 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-webhook-cert\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.597588 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67l2j\" (UniqueName: \"kubernetes.io/projected/9f3dfaad-d451-448b-a447-47fc7bbff0e5-kube-api-access-67l2j\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.603537 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-apiservice-cert\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.606602 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-webhook-cert\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.613644 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67l2j\" (UniqueName: \"kubernetes.io/projected/9f3dfaad-d451-448b-a447-47fc7bbff0e5-kube-api-access-67l2j\") pod \"swift-operator-controller-manager-59595cd-9djr5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:04 crc kubenswrapper[4751]: I0131 14:58:04.621561 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:05 crc kubenswrapper[4751]: I0131 14:58:05.097928 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-59595cd-9djr5"] Jan 31 14:58:05 crc kubenswrapper[4751]: I0131 14:58:05.107014 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 14:58:05 crc kubenswrapper[4751]: I0131 14:58:05.967728 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" event={"ID":"9f3dfaad-d451-448b-a447-47fc7bbff0e5","Type":"ContainerStarted","Data":"da3b689c07e135768fb2bc22c72ffa9872cf722e04a986707e86515f65114b9c"} Jan 31 14:58:07 crc kubenswrapper[4751]: I0131 14:58:07.990161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" event={"ID":"9f3dfaad-d451-448b-a447-47fc7bbff0e5","Type":"ContainerStarted","Data":"668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755"} Jan 31 14:58:07 crc kubenswrapper[4751]: 
I0131 14:58:07.990533 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:08 crc kubenswrapper[4751]: I0131 14:58:08.024997 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" podStartSLOduration=2.059231994 podStartE2EDuration="4.024959423s" podCreationTimestamp="2026-01-31 14:58:04 +0000 UTC" firstStartedPulling="2026-01-31 14:58:05.106816771 +0000 UTC m=+989.481529656" lastFinishedPulling="2026-01-31 14:58:07.0725442 +0000 UTC m=+991.447257085" observedRunningTime="2026-01-31 14:58:08.01572853 +0000 UTC m=+992.390441485" watchObservedRunningTime="2026-01-31 14:58:08.024959423 +0000 UTC m=+992.399672388" Jan 31 14:58:08 crc kubenswrapper[4751]: I0131 14:58:08.897044 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:58:08 crc kubenswrapper[4751]: I0131 14:58:08.897165 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.594594 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7"] Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.595556 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.597254 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.597374 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fdxgb" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.611229 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7"] Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.689918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8fsr\" (UniqueName: \"kubernetes.io/projected/91cc4333-403a-4ce4-a347-8b475ad0169a-kube-api-access-f8fsr\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.689993 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91cc4333-403a-4ce4-a347-8b475ad0169a-apiservice-cert\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.690215 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91cc4333-403a-4ce4-a347-8b475ad0169a-webhook-cert\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: 
\"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.791278 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91cc4333-403a-4ce4-a347-8b475ad0169a-webhook-cert\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.791334 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fsr\" (UniqueName: \"kubernetes.io/projected/91cc4333-403a-4ce4-a347-8b475ad0169a-kube-api-access-f8fsr\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.791384 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91cc4333-403a-4ce4-a347-8b475ad0169a-apiservice-cert\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.796491 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91cc4333-403a-4ce4-a347-8b475ad0169a-webhook-cert\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.802710 4751 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91cc4333-403a-4ce4-a347-8b475ad0169a-apiservice-cert\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.807764 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fsr\" (UniqueName: \"kubernetes.io/projected/91cc4333-403a-4ce4-a347-8b475ad0169a-kube-api-access-f8fsr\") pod \"horizon-operator-controller-manager-6d4cb5b58-r8xn7\" (UID: \"91cc4333-403a-4ce4-a347-8b475ad0169a\") " pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:13 crc kubenswrapper[4751]: I0131 14:58:13.914487 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:14 crc kubenswrapper[4751]: I0131 14:58:14.318727 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7"] Jan 31 14:58:14 crc kubenswrapper[4751]: I0131 14:58:14.627572 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 14:58:15 crc kubenswrapper[4751]: I0131 14:58:15.049674 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" event={"ID":"91cc4333-403a-4ce4-a347-8b475ad0169a","Type":"ContainerStarted","Data":"db6a2307fd9e1cecbdc1efd47215efe13bdd3558905a52adfb7808e20c228b72"} Jan 31 14:58:17 crc kubenswrapper[4751]: I0131 14:58:17.065188 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" 
event={"ID":"91cc4333-403a-4ce4-a347-8b475ad0169a","Type":"ContainerStarted","Data":"54a3468fd01d3d2ceae1f10629ed129d49cf0c01e0869142914b381d771228f2"} Jan 31 14:58:17 crc kubenswrapper[4751]: I0131 14:58:17.065726 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:17 crc kubenswrapper[4751]: I0131 14:58:17.084907 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" podStartSLOduration=1.8851242080000001 podStartE2EDuration="4.084889204s" podCreationTimestamp="2026-01-31 14:58:13 +0000 UTC" firstStartedPulling="2026-01-31 14:58:14.321618049 +0000 UTC m=+998.696330934" lastFinishedPulling="2026-01-31 14:58:16.521383045 +0000 UTC m=+1000.896095930" observedRunningTime="2026-01-31 14:58:17.08437527 +0000 UTC m=+1001.459088155" watchObservedRunningTime="2026-01-31 14:58:17.084889204 +0000 UTC m=+1001.459602079" Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.908622 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.917744 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.922671 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.923907 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.924268 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.925629 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-nwfcb" Jan 31 14:58:20 crc kubenswrapper[4751]: I0131 14:58:20.947083 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.095590 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.095805 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvvn\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-kube-api-access-kqvvn\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.095917 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: 
\"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.096045 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-cache\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.096192 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-lock\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.197813 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-cache\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.197893 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-lock\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.197931 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.197953 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kqvvn\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-kube-api-access-kqvvn\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.197969 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: E0131 14:58:21.198102 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:21 crc kubenswrapper[4751]: E0131 14:58:21.198114 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 14:58:21 crc kubenswrapper[4751]: E0131 14:58:21.198155 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift podName:440e5809-7b49-4b21-99dd-668468c84017 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:21.698140724 +0000 UTC m=+1006.072853599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift") pod "swift-storage-0" (UID: "440e5809-7b49-4b21-99dd-668468c84017") : configmap "swift-ring-files" not found Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.198544 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-cache\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.198605 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.210572 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-lock\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.220723 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.245175 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvvn\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-kube-api-access-kqvvn\") pod \"swift-storage-0\" (UID: 
\"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: I0131 14:58:21.704694 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:21 crc kubenswrapper[4751]: E0131 14:58:21.704895 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:21 crc kubenswrapper[4751]: E0131 14:58:21.704924 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 14:58:21 crc kubenswrapper[4751]: E0131 14:58:21.704988 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift podName:440e5809-7b49-4b21-99dd-668468c84017 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:22.704970521 +0000 UTC m=+1007.079683416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift") pod "swift-storage-0" (UID: "440e5809-7b49-4b21-99dd-668468c84017") : configmap "swift-ring-files" not found Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.521295 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-bvvpv"] Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.524209 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.527367 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-jtwrn" Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.556642 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-bvvpv"] Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.720350 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.720480 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvgv\" (UniqueName: \"kubernetes.io/projected/eacc0c6c-95c4-487f-945e-4a1e3e17c508-kube-api-access-vtvgv\") pod \"glance-operator-index-bvvpv\" (UID: \"eacc0c6c-95c4-487f-945e-4a1e3e17c508\") " pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:22 crc kubenswrapper[4751]: E0131 14:58:22.720640 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:22 crc kubenswrapper[4751]: E0131 14:58:22.720668 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 14:58:22 crc kubenswrapper[4751]: E0131 14:58:22.720743 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift podName:440e5809-7b49-4b21-99dd-668468c84017 nodeName:}" failed. 
No retries permitted until 2026-01-31 14:58:24.720717261 +0000 UTC m=+1009.095430156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift") pod "swift-storage-0" (UID: "440e5809-7b49-4b21-99dd-668468c84017") : configmap "swift-ring-files" not found Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.822120 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvgv\" (UniqueName: \"kubernetes.io/projected/eacc0c6c-95c4-487f-945e-4a1e3e17c508-kube-api-access-vtvgv\") pod \"glance-operator-index-bvvpv\" (UID: \"eacc0c6c-95c4-487f-945e-4a1e3e17c508\") " pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:22 crc kubenswrapper[4751]: I0131 14:58:22.858973 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvgv\" (UniqueName: \"kubernetes.io/projected/eacc0c6c-95c4-487f-945e-4a1e3e17c508-kube-api-access-vtvgv\") pod \"glance-operator-index-bvvpv\" (UID: \"eacc0c6c-95c4-487f-945e-4a1e3e17c508\") " pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:23 crc kubenswrapper[4751]: I0131 14:58:23.156940 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:23 crc kubenswrapper[4751]: I0131 14:58:23.380034 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-bvvpv"] Jan 31 14:58:23 crc kubenswrapper[4751]: I0131 14:58:23.918693 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d4cb5b58-r8xn7" Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.130924 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-bvvpv" event={"ID":"eacc0c6c-95c4-487f-945e-4a1e3e17c508","Type":"ContainerStarted","Data":"701664b77023940ba4b0968a1f7dc87bd2c93fe4b8f5f2f39b4e39a24e4b2f4b"} Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.752823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:24 crc kubenswrapper[4751]: E0131 14:58:24.752989 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:24 crc kubenswrapper[4751]: E0131 14:58:24.753001 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 14:58:24 crc kubenswrapper[4751]: E0131 14:58:24.753047 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift podName:440e5809-7b49-4b21-99dd-668468c84017 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:28.75303457 +0000 UTC m=+1013.127747455 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift") pod "swift-storage-0" (UID: "440e5809-7b49-4b21-99dd-668468c84017") : configmap "swift-ring-files" not found Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.909357 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-z72xp"] Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.910442 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.912886 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.913046 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.913053 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Jan 31 14:58:24 crc kubenswrapper[4751]: I0131 14:58:24.927536 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-z72xp"] Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.057285 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-swiftconf\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.057364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/606aa4a9-2afe-4f51-a562-90f716040b58-etc-swift\") pod 
\"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.057417 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-scripts\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.057503 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84kdw\" (UniqueName: \"kubernetes.io/projected/606aa4a9-2afe-4f51-a562-90f716040b58-kube-api-access-84kdw\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.057538 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-dispersionconf\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.057567 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-ring-data-devices\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159130 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-scripts\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159192 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84kdw\" (UniqueName: \"kubernetes.io/projected/606aa4a9-2afe-4f51-a562-90f716040b58-kube-api-access-84kdw\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159216 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-dispersionconf\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159241 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-ring-data-devices\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159308 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-swiftconf\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159331 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/606aa4a9-2afe-4f51-a562-90f716040b58-etc-swift\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.159806 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/606aa4a9-2afe-4f51-a562-90f716040b58-etc-swift\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.160210 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-scripts\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.160651 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-ring-data-devices\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.167566 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-swiftconf\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.169607 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-dispersionconf\") pod 
\"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.182867 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84kdw\" (UniqueName: \"kubernetes.io/projected/606aa4a9-2afe-4f51-a562-90f716040b58-kube-api-access-84kdw\") pod \"swift-ring-rebalance-z72xp\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:25 crc kubenswrapper[4751]: I0131 14:58:25.229929 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:26 crc kubenswrapper[4751]: I0131 14:58:26.237479 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-z72xp"] Jan 31 14:58:27 crc kubenswrapper[4751]: I0131 14:58:27.161852 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-bvvpv" event={"ID":"eacc0c6c-95c4-487f-945e-4a1e3e17c508","Type":"ContainerStarted","Data":"432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091"} Jan 31 14:58:27 crc kubenswrapper[4751]: I0131 14:58:27.164565 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" event={"ID":"606aa4a9-2afe-4f51-a562-90f716040b58","Type":"ContainerStarted","Data":"6d59aea6f88c0b1dfe68e8fb50352f164b60343fcb2577fd3636718b016322c8"} Jan 31 14:58:27 crc kubenswrapper[4751]: I0131 14:58:27.181826 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-bvvpv" podStartSLOduration=1.597177464 podStartE2EDuration="5.181807642s" podCreationTimestamp="2026-01-31 14:58:22 +0000 UTC" firstStartedPulling="2026-01-31 14:58:23.389503719 +0000 UTC m=+1007.764216594" lastFinishedPulling="2026-01-31 14:58:26.974133887 +0000 UTC 
m=+1011.348846772" observedRunningTime="2026-01-31 14:58:27.177440857 +0000 UTC m=+1011.552153762" watchObservedRunningTime="2026-01-31 14:58:27.181807642 +0000 UTC m=+1011.556520557" Jan 31 14:58:28 crc kubenswrapper[4751]: I0131 14:58:28.815861 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:28 crc kubenswrapper[4751]: E0131 14:58:28.816009 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:28 crc kubenswrapper[4751]: E0131 14:58:28.816152 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 14:58:28 crc kubenswrapper[4751]: E0131 14:58:28.816206 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift podName:440e5809-7b49-4b21-99dd-668468c84017 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:36.816187701 +0000 UTC m=+1021.190900586 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift") pod "swift-storage-0" (UID: "440e5809-7b49-4b21-99dd-668468c84017") : configmap "swift-ring-files" not found Jan 31 14:58:32 crc kubenswrapper[4751]: I0131 14:58:32.781590 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 14:58:33 crc kubenswrapper[4751]: I0131 14:58:33.157876 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:33 crc kubenswrapper[4751]: I0131 14:58:33.157965 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:33 crc kubenswrapper[4751]: I0131 14:58:33.182569 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:33 crc kubenswrapper[4751]: I0131 14:58:33.239293 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 14:58:34 crc kubenswrapper[4751]: I0131 14:58:34.218527 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" event={"ID":"606aa4a9-2afe-4f51-a562-90f716040b58","Type":"ContainerStarted","Data":"d7982a0dd9c095e8b3eb11a8ff02587d379ddd34791962ab93f48b60a33bec98"} Jan 31 14:58:34 crc kubenswrapper[4751]: I0131 14:58:34.238053 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" podStartSLOduration=3.72631647 podStartE2EDuration="10.238034396s" podCreationTimestamp="2026-01-31 14:58:24 +0000 UTC" firstStartedPulling="2026-01-31 14:58:26.842458052 +0000 UTC m=+1011.217170937" lastFinishedPulling="2026-01-31 14:58:33.354175958 +0000 UTC m=+1017.728888863" 
observedRunningTime="2026-01-31 14:58:34.236176037 +0000 UTC m=+1018.610888932" watchObservedRunningTime="2026-01-31 14:58:34.238034396 +0000 UTC m=+1018.612747281" Jan 31 14:58:36 crc kubenswrapper[4751]: I0131 14:58:36.851615 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:36 crc kubenswrapper[4751]: E0131 14:58:36.851796 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:36 crc kubenswrapper[4751]: E0131 14:58:36.852214 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 14:58:36 crc kubenswrapper[4751]: E0131 14:58:36.852291 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift podName:440e5809-7b49-4b21-99dd-668468c84017 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:52.85226761 +0000 UTC m=+1037.226980515 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift") pod "swift-storage-0" (UID: "440e5809-7b49-4b21-99dd-668468c84017") : configmap "swift-ring-files" not found Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.884640 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-58vrl"] Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.886157 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.896267 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-58vrl"] Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.967714 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-run-httpd\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.967837 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-log-httpd\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.967878 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.967901 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgx9\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-kube-api-access-6xgx9\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:37 crc kubenswrapper[4751]: I0131 14:58:37.967919 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ee66f9-5607-4559-9a64-6767dfbcc078-config-data\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.068673 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-log-httpd\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.068964 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.068986 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgx9\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-kube-api-access-6xgx9\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.069006 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ee66f9-5607-4559-9a64-6767dfbcc078-config-data\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.069091 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-run-httpd\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.069206 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-log-httpd\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: E0131 14:58:38.069276 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:38 crc kubenswrapper[4751]: E0131 14:58:38.069303 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-58vrl: configmap "swift-ring-files" not found Jan 31 14:58:38 crc kubenswrapper[4751]: E0131 14:58:38.069363 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift podName:26ee66f9-5607-4559-9a64-6767dfbcc078 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:38.569345357 +0000 UTC m=+1022.944058242 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift") pod "swift-proxy-6d699db77c-58vrl" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078") : configmap "swift-ring-files" not found Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.069611 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-run-httpd\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.077645 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ee66f9-5607-4559-9a64-6767dfbcc078-config-data\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.088374 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgx9\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-kube-api-access-6xgx9\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.574910 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:38 crc kubenswrapper[4751]: E0131 14:58:38.575134 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not 
found Jan 31 14:58:38 crc kubenswrapper[4751]: E0131 14:58:38.575165 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-58vrl: configmap "swift-ring-files" not found Jan 31 14:58:38 crc kubenswrapper[4751]: E0131 14:58:38.575222 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift podName:26ee66f9-5607-4559-9a64-6767dfbcc078 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:39.575204869 +0000 UTC m=+1023.949917754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift") pod "swift-proxy-6d699db77c-58vrl" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078") : configmap "swift-ring-files" not found Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.897186 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:58:38 crc kubenswrapper[4751]: I0131 14:58:38.897278 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:58:39 crc kubenswrapper[4751]: I0131 14:58:39.590313 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " 
pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:39 crc kubenswrapper[4751]: E0131 14:58:39.590542 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:39 crc kubenswrapper[4751]: E0131 14:58:39.590957 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-58vrl: configmap "swift-ring-files" not found Jan 31 14:58:39 crc kubenswrapper[4751]: E0131 14:58:39.591057 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift podName:26ee66f9-5607-4559-9a64-6767dfbcc078 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:41.59102945 +0000 UTC m=+1025.965742345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift") pod "swift-proxy-6d699db77c-58vrl" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078") : configmap "swift-ring-files" not found Jan 31 14:58:40 crc kubenswrapper[4751]: I0131 14:58:40.961236 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp"] Jan 31 14:58:40 crc kubenswrapper[4751]: I0131 14:58:40.962490 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:40 crc kubenswrapper[4751]: I0131 14:58:40.965165 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wxkjx" Jan 31 14:58:40 crc kubenswrapper[4751]: I0131 14:58:40.975411 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp"] Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.112764 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzcg8\" (UniqueName: \"kubernetes.io/projected/585f0c4b-3594-4683-bb38-d1fcbbee12cd-kube-api-access-rzcg8\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.113143 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-util\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.113428 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-bundle\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 
14:58:41.215161 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-util\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.215660 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-util\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.215999 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-bundle\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.216283 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzcg8\" (UniqueName: \"kubernetes.io/projected/585f0c4b-3594-4683-bb38-d1fcbbee12cd-kube-api-access-rzcg8\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.216570 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-bundle\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.240372 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzcg8\" (UniqueName: \"kubernetes.io/projected/585f0c4b-3594-4683-bb38-d1fcbbee12cd-kube-api-access-rzcg8\") pod \"9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.280872 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.621923 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:41 crc kubenswrapper[4751]: E0131 14:58:41.622132 4751 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 14:58:41 crc kubenswrapper[4751]: E0131 14:58:41.622149 4751 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-58vrl: configmap "swift-ring-files" not found Jan 31 14:58:41 crc kubenswrapper[4751]: E0131 14:58:41.622201 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift 
podName:26ee66f9-5607-4559-9a64-6767dfbcc078 nodeName:}" failed. No retries permitted until 2026-01-31 14:58:45.622182879 +0000 UTC m=+1029.996895774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift") pod "swift-proxy-6d699db77c-58vrl" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078") : configmap "swift-ring-files" not found Jan 31 14:58:41 crc kubenswrapper[4751]: I0131 14:58:41.756122 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp"] Jan 31 14:58:42 crc kubenswrapper[4751]: I0131 14:58:42.277569 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" event={"ID":"585f0c4b-3594-4683-bb38-d1fcbbee12cd","Type":"ContainerStarted","Data":"fe746551f3718a5230142c952bda79cd40176a8ea52e7c55612bd125b16c09e1"} Jan 31 14:58:42 crc kubenswrapper[4751]: I0131 14:58:42.279992 4751 generic.go:334] "Generic (PLEG): container finished" podID="606aa4a9-2afe-4f51-a562-90f716040b58" containerID="d7982a0dd9c095e8b3eb11a8ff02587d379ddd34791962ab93f48b60a33bec98" exitCode=0 Jan 31 14:58:42 crc kubenswrapper[4751]: I0131 14:58:42.280038 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" event={"ID":"606aa4a9-2afe-4f51-a562-90f716040b58","Type":"ContainerDied","Data":"d7982a0dd9c095e8b3eb11a8ff02587d379ddd34791962ab93f48b60a33bec98"} Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.686888 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.856711 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84kdw\" (UniqueName: \"kubernetes.io/projected/606aa4a9-2afe-4f51-a562-90f716040b58-kube-api-access-84kdw\") pod \"606aa4a9-2afe-4f51-a562-90f716040b58\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.856807 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-swiftconf\") pod \"606aa4a9-2afe-4f51-a562-90f716040b58\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.856883 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/606aa4a9-2afe-4f51-a562-90f716040b58-etc-swift\") pod \"606aa4a9-2afe-4f51-a562-90f716040b58\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.856915 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-scripts\") pod \"606aa4a9-2afe-4f51-a562-90f716040b58\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.856990 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-ring-data-devices\") pod \"606aa4a9-2afe-4f51-a562-90f716040b58\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.857058 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-dispersionconf\") pod \"606aa4a9-2afe-4f51-a562-90f716040b58\" (UID: \"606aa4a9-2afe-4f51-a562-90f716040b58\") " Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.857819 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606aa4a9-2afe-4f51-a562-90f716040b58-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "606aa4a9-2afe-4f51-a562-90f716040b58" (UID: "606aa4a9-2afe-4f51-a562-90f716040b58"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.858305 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "606aa4a9-2afe-4f51-a562-90f716040b58" (UID: "606aa4a9-2afe-4f51-a562-90f716040b58"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.862472 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606aa4a9-2afe-4f51-a562-90f716040b58-kube-api-access-84kdw" (OuterVolumeSpecName: "kube-api-access-84kdw") pod "606aa4a9-2afe-4f51-a562-90f716040b58" (UID: "606aa4a9-2afe-4f51-a562-90f716040b58"). InnerVolumeSpecName "kube-api-access-84kdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.867233 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "606aa4a9-2afe-4f51-a562-90f716040b58" (UID: "606aa4a9-2afe-4f51-a562-90f716040b58"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.880221 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "606aa4a9-2afe-4f51-a562-90f716040b58" (UID: "606aa4a9-2afe-4f51-a562-90f716040b58"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.890920 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-scripts" (OuterVolumeSpecName: "scripts") pod "606aa4a9-2afe-4f51-a562-90f716040b58" (UID: "606aa4a9-2afe-4f51-a562-90f716040b58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.958555 4751 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.958589 4751 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.958600 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84kdw\" (UniqueName: \"kubernetes.io/projected/606aa4a9-2afe-4f51-a562-90f716040b58-kube-api-access-84kdw\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.958613 4751 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/606aa4a9-2afe-4f51-a562-90f716040b58-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:43 crc 
kubenswrapper[4751]: I0131 14:58:43.958624 4751 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/606aa4a9-2afe-4f51-a562-90f716040b58-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:43 crc kubenswrapper[4751]: I0131 14:58:43.958635 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/606aa4a9-2afe-4f51-a562-90f716040b58-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:44 crc kubenswrapper[4751]: I0131 14:58:44.298103 4751 generic.go:334] "Generic (PLEG): container finished" podID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerID="3bbca91afaf0c02d15eadfd14c9b7b21724ed7ad9f88766a7c7a0c41fcf118a3" exitCode=0 Jan 31 14:58:44 crc kubenswrapper[4751]: I0131 14:58:44.298177 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" event={"ID":"585f0c4b-3594-4683-bb38-d1fcbbee12cd","Type":"ContainerDied","Data":"3bbca91afaf0c02d15eadfd14c9b7b21724ed7ad9f88766a7c7a0c41fcf118a3"} Jan 31 14:58:44 crc kubenswrapper[4751]: I0131 14:58:44.299662 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" event={"ID":"606aa4a9-2afe-4f51-a562-90f716040b58","Type":"ContainerDied","Data":"6d59aea6f88c0b1dfe68e8fb50352f164b60343fcb2577fd3636718b016322c8"} Jan 31 14:58:44 crc kubenswrapper[4751]: I0131 14:58:44.299687 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-z72xp" Jan 31 14:58:44 crc kubenswrapper[4751]: I0131 14:58:44.299721 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d59aea6f88c0b1dfe68e8fb50352f164b60343fcb2577fd3636718b016322c8" Jan 31 14:58:45 crc kubenswrapper[4751]: I0131 14:58:45.309869 4751 generic.go:334] "Generic (PLEG): container finished" podID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerID="f8a8825d481236aeb9aa96c02aca48495f3689b5e59d7cbdc781d2a43a293e1d" exitCode=0 Jan 31 14:58:45 crc kubenswrapper[4751]: I0131 14:58:45.309906 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" event={"ID":"585f0c4b-3594-4683-bb38-d1fcbbee12cd","Type":"ContainerDied","Data":"f8a8825d481236aeb9aa96c02aca48495f3689b5e59d7cbdc781d2a43a293e1d"} Jan 31 14:58:45 crc kubenswrapper[4751]: I0131 14:58:45.687735 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:45 crc kubenswrapper[4751]: I0131 14:58:45.696492 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"swift-proxy-6d699db77c-58vrl\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:45 crc kubenswrapper[4751]: I0131 14:58:45.705143 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:45 crc kubenswrapper[4751]: I0131 14:58:45.940222 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-58vrl"] Jan 31 14:58:45 crc kubenswrapper[4751]: W0131 14:58:45.946271 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26ee66f9_5607_4559_9a64_6767dfbcc078.slice/crio-85b271a6f57bacb15bd471b08b0e25366c5d1865f74c103bc014e71042620a53 WatchSource:0}: Error finding container 85b271a6f57bacb15bd471b08b0e25366c5d1865f74c103bc014e71042620a53: Status 404 returned error can't find the container with id 85b271a6f57bacb15bd471b08b0e25366c5d1865f74c103bc014e71042620a53 Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 14:58:46.318251 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" event={"ID":"26ee66f9-5607-4559-9a64-6767dfbcc078","Type":"ContainerStarted","Data":"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be"} Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 14:58:46.318513 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" event={"ID":"26ee66f9-5607-4559-9a64-6767dfbcc078","Type":"ContainerStarted","Data":"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321"} Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 14:58:46.318523 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" event={"ID":"26ee66f9-5607-4559-9a64-6767dfbcc078","Type":"ContainerStarted","Data":"85b271a6f57bacb15bd471b08b0e25366c5d1865f74c103bc014e71042620a53"} Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 14:58:46.318538 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 
14:58:46.321214 4751 generic.go:334] "Generic (PLEG): container finished" podID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerID="30a855aaf2538d16f15d520cfdce2fe3cf7008190e9478d912986cc8f0f389d2" exitCode=0 Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 14:58:46.321316 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" event={"ID":"585f0c4b-3594-4683-bb38-d1fcbbee12cd","Type":"ContainerDied","Data":"30a855aaf2538d16f15d520cfdce2fe3cf7008190e9478d912986cc8f0f389d2"} Jan 31 14:58:46 crc kubenswrapper[4751]: I0131 14:58:46.361364 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" podStartSLOduration=9.36134065 podStartE2EDuration="9.36134065s" podCreationTimestamp="2026-01-31 14:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:58:46.340843441 +0000 UTC m=+1030.715556336" watchObservedRunningTime="2026-01-31 14:58:46.36134065 +0000 UTC m=+1030.736053535" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.330048 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.595735 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.730264 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-bundle\") pod \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.730407 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzcg8\" (UniqueName: \"kubernetes.io/projected/585f0c4b-3594-4683-bb38-d1fcbbee12cd-kube-api-access-rzcg8\") pod \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.730485 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-util\") pod \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\" (UID: \"585f0c4b-3594-4683-bb38-d1fcbbee12cd\") " Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.732786 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-bundle" (OuterVolumeSpecName: "bundle") pod "585f0c4b-3594-4683-bb38-d1fcbbee12cd" (UID: "585f0c4b-3594-4683-bb38-d1fcbbee12cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.737388 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585f0c4b-3594-4683-bb38-d1fcbbee12cd-kube-api-access-rzcg8" (OuterVolumeSpecName: "kube-api-access-rzcg8") pod "585f0c4b-3594-4683-bb38-d1fcbbee12cd" (UID: "585f0c4b-3594-4683-bb38-d1fcbbee12cd"). InnerVolumeSpecName "kube-api-access-rzcg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.745041 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-util" (OuterVolumeSpecName: "util") pod "585f0c4b-3594-4683-bb38-d1fcbbee12cd" (UID: "585f0c4b-3594-4683-bb38-d1fcbbee12cd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.832233 4751 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-util\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.832267 4751 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/585f0c4b-3594-4683-bb38-d1fcbbee12cd-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:47 crc kubenswrapper[4751]: I0131 14:58:47.832278 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzcg8\" (UniqueName: \"kubernetes.io/projected/585f0c4b-3594-4683-bb38-d1fcbbee12cd-kube-api-access-rzcg8\") on node \"crc\" DevicePath \"\"" Jan 31 14:58:48 crc kubenswrapper[4751]: I0131 14:58:48.340511 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" Jan 31 14:58:48 crc kubenswrapper[4751]: I0131 14:58:48.340522 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp" event={"ID":"585f0c4b-3594-4683-bb38-d1fcbbee12cd","Type":"ContainerDied","Data":"fe746551f3718a5230142c952bda79cd40176a8ea52e7c55612bd125b16c09e1"} Jan 31 14:58:48 crc kubenswrapper[4751]: I0131 14:58:48.340573 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe746551f3718a5230142c952bda79cd40176a8ea52e7c55612bd125b16c09e1" Jan 31 14:58:52 crc kubenswrapper[4751]: I0131 14:58:52.907189 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:52 crc kubenswrapper[4751]: I0131 14:58:52.919674 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"swift-storage-0\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:53 crc kubenswrapper[4751]: I0131 14:58:53.032728 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 14:58:53 crc kubenswrapper[4751]: I0131 14:58:53.522880 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 14:58:54 crc kubenswrapper[4751]: I0131 14:58:54.382976 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"7955d37d9d1be24fa8d9a015aa2ea953036cee2a0334d1dbf39fdbe1dcef40e5"} Jan 31 14:58:55 crc kubenswrapper[4751]: I0131 14:58:55.392286 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"a4e14596c5c3a7af2ea9e82736c916fc73b8fcbf27a523b8fe47f9a8e69b1bc2"} Jan 31 14:58:55 crc kubenswrapper[4751]: I0131 14:58:55.392576 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"461a1aaa8bc72705195647c97b28e111484e900c69e9a4da07e510a6c451ed4c"} Jan 31 14:58:55 crc kubenswrapper[4751]: I0131 14:58:55.392589 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"d0ab6cd06ea2abbd171a5345dc579495df175d9d8a52b30a0139e24e65e43616"} Jan 31 14:58:55 crc kubenswrapper[4751]: I0131 14:58:55.392602 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"d93f0c8cc4f4e310c9d207351f924f281c14e44b511b3d4a8f51fed27dbeed8f"} Jan 31 14:58:55 crc kubenswrapper[4751]: I0131 14:58:55.707880 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:55 crc 
kubenswrapper[4751]: I0131 14:58:55.710816 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 14:58:57 crc kubenswrapper[4751]: I0131 14:58:57.451395 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"400722d3dac6cd5b0b727b3e599b127bb527981160049f2561a32e7ada14affd"} Jan 31 14:58:57 crc kubenswrapper[4751]: I0131 14:58:57.451792 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"34a87b0cfca857f6a2c07d4713531103b7df75f0fdc3e2be299ecaf554d5d9db"} Jan 31 14:58:57 crc kubenswrapper[4751]: I0131 14:58:57.451804 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"03c86cbbc819872662746f2a8384c7c50f07b481c42b5f3d39e0b1e87c7b0557"} Jan 31 14:58:57 crc kubenswrapper[4751]: I0131 14:58:57.451814 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"3b4375e902d16ea731761694aa85354dcfcda568f68f1d4210b06b07c701f380"} Jan 31 14:58:58 crc kubenswrapper[4751]: I0131 14:58:58.467940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"1f74cf8c2ce97cd17f509447e4c986197d8af0e8b2f40e7c6a07653c81e66d3b"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.480006 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"ae11b6c0a7f7893c0ba728593c9e1b6db0bc399ae9c55df1f1023d422fc9333c"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.480309 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"519bd8155f30918b172e24832e84310378bd7ea10e796377a992dd3fe9e7276d"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.480319 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"950232b5b660c70b9100e81003ff993443f745f40d7da6ba8dc037822059cb8e"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.480329 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"71ca1416bdc095b268ec385a4ebcd269b729c80c3aee7f832db2892f4fe6e78a"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.480340 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"1e2003fe4d2366b583ebedf393e2492c910be0ebf3f2652f5a15b1e8c78961df"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.480351 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerStarted","Data":"03b25054db738f38056ec8af2822c9203e252f1a4f95be8c4ab8c1c34de3455c"} Jan 31 14:58:59 crc kubenswrapper[4751]: I0131 14:58:59.521767 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=35.875552909 podStartE2EDuration="40.521746512s" podCreationTimestamp="2026-01-31 14:58:19 
+0000 UTC" firstStartedPulling="2026-01-31 14:58:53.521585855 +0000 UTC m=+1037.896298740" lastFinishedPulling="2026-01-31 14:58:58.167779458 +0000 UTC m=+1042.542492343" observedRunningTime="2026-01-31 14:58:59.515816756 +0000 UTC m=+1043.890529651" watchObservedRunningTime="2026-01-31 14:58:59.521746512 +0000 UTC m=+1043.896459397" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.649616 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"] Jan 31 14:59:06 crc kubenswrapper[4751]: E0131 14:59:06.650507 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606aa4a9-2afe-4f51-a562-90f716040b58" containerName="swift-ring-rebalance" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.650523 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="606aa4a9-2afe-4f51-a562-90f716040b58" containerName="swift-ring-rebalance" Jan 31 14:59:06 crc kubenswrapper[4751]: E0131 14:59:06.650554 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="extract" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.650563 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="extract" Jan 31 14:59:06 crc kubenswrapper[4751]: E0131 14:59:06.650578 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="pull" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.650587 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="pull" Jan 31 14:59:06 crc kubenswrapper[4751]: E0131 14:59:06.650603 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="util" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.650612 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="util" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.650763 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" containerName="extract" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.650787 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="606aa4a9-2afe-4f51-a562-90f716040b58" containerName="swift-ring-rebalance" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.651360 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.653685 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.655752 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ph7z8" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.666937 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"] Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.738583 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-webhook-cert\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.738655 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-apiservice-cert\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.738754 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6f7g\" (UniqueName: \"kubernetes.io/projected/f70443db-a342-4f5d-81b2-39c01f494cf8-kube-api-access-l6f7g\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.839881 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-webhook-cert\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.839942 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-apiservice-cert\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.840051 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6f7g\" (UniqueName: \"kubernetes.io/projected/f70443db-a342-4f5d-81b2-39c01f494cf8-kube-api-access-l6f7g\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " 
pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.855251 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-webhook-cert\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.855256 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-apiservice-cert\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.861879 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6f7g\" (UniqueName: \"kubernetes.io/projected/f70443db-a342-4f5d-81b2-39c01f494cf8-kube-api-access-l6f7g\") pod \"glance-operator-controller-manager-75dc47fc9-v4thz\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:06 crc kubenswrapper[4751]: I0131 14:59:06.976696 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 14:59:07 crc kubenswrapper[4751]: I0131 14:59:07.481286 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"] Jan 31 14:59:07 crc kubenswrapper[4751]: W0131 14:59:07.488186 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf70443db_a342_4f5d_81b2_39c01f494cf8.slice/crio-1eff2ed52d31a6cb86d6cac75fe9fb2899624e91687b3dbe55c93d71e4cef517 WatchSource:0}: Error finding container 1eff2ed52d31a6cb86d6cac75fe9fb2899624e91687b3dbe55c93d71e4cef517: Status 404 returned error can't find the container with id 1eff2ed52d31a6cb86d6cac75fe9fb2899624e91687b3dbe55c93d71e4cef517 Jan 31 14:59:07 crc kubenswrapper[4751]: I0131 14:59:07.550451 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" event={"ID":"f70443db-a342-4f5d-81b2-39c01f494cf8","Type":"ContainerStarted","Data":"1eff2ed52d31a6cb86d6cac75fe9fb2899624e91687b3dbe55c93d71e4cef517"} Jan 31 14:59:08 crc kubenswrapper[4751]: I0131 14:59:08.896834 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 14:59:08 crc kubenswrapper[4751]: I0131 14:59:08.897226 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 14:59:08 crc kubenswrapper[4751]: I0131 14:59:08.897286 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7"
Jan 31 14:59:08 crc kubenswrapper[4751]: I0131 14:59:08.898005 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc064826cd8a78005216541d25736856cc2dd920bfe44778b79dbfd2f76ed341"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 14:59:08 crc kubenswrapper[4751]: I0131 14:59:08.898095 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://dc064826cd8a78005216541d25736856cc2dd920bfe44778b79dbfd2f76ed341" gracePeriod=600
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.569304 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" event={"ID":"f70443db-a342-4f5d-81b2-39c01f494cf8","Type":"ContainerStarted","Data":"ab946ef56298d90f2da08c7aa03dc9761afb66c0a527a34685eef2375ecebd56"}
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.569836 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.573128 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="dc064826cd8a78005216541d25736856cc2dd920bfe44778b79dbfd2f76ed341" exitCode=0
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.573181 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"dc064826cd8a78005216541d25736856cc2dd920bfe44778b79dbfd2f76ed341"}
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.573213 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"7fcf941f127d31d0e5c99d5ef038c633782d289d0e911f4e9c5c6f77b2a91e2a"}
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.573235 4751 scope.go:117] "RemoveContainer" containerID="f4d4f92719c72ec0adb31e02a10d5c8bcb4b1a9b3bfb5b0e7ed8cfdbc4a53235"
Jan 31 14:59:09 crc kubenswrapper[4751]: I0131 14:59:09.590449 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" podStartSLOduration=2.293730486 podStartE2EDuration="3.590428321s" podCreationTimestamp="2026-01-31 14:59:06 +0000 UTC" firstStartedPulling="2026-01-31 14:59:07.492224529 +0000 UTC m=+1051.866937404" lastFinishedPulling="2026-01-31 14:59:08.788922344 +0000 UTC m=+1053.163635239" observedRunningTime="2026-01-31 14:59:09.590232926 +0000 UTC m=+1053.964945811" watchObservedRunningTime="2026-01-31 14:59:09.590428321 +0000 UTC m=+1053.965141206"
Jan 31 14:59:16 crc kubenswrapper[4751]: I0131 14:59:16.982735 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.501393 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.502686 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.505793 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-gh2c4"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.506376 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.506641 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.506900 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.509912 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.552366 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-bf79-account-create-update-whmk8"]
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.553587 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.557357 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.561860 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-b8nfw"]
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.563088 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.575683 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-b8nfw"]
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.586463 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-bf79-account-create-update-whmk8"]
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.650734 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-config\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.650784 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d360673b-7556-44b9-b7bd-4805810da349-openstack-config-secret\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.650816 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-operator-scripts\") pod \"glance-db-create-b8nfw\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") " pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.650836 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs77g\" (UniqueName: \"kubernetes.io/projected/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-kube-api-access-bs77g\") pod \"glance-db-create-b8nfw\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") " pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.650865 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbgs\" (UniqueName: \"kubernetes.io/projected/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-kube-api-access-cqbgs\") pod \"glance-bf79-account-create-update-whmk8\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") " pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.651034 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ch7w\" (UniqueName: \"kubernetes.io/projected/d360673b-7556-44b9-b7bd-4805810da349-kube-api-access-8ch7w\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.651114 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-scripts\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.651170 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-operator-scripts\") pod \"glance-bf79-account-create-update-whmk8\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") " pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752444 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-scripts\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752534 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-operator-scripts\") pod \"glance-bf79-account-create-update-whmk8\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") " pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752646 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-config\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752690 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d360673b-7556-44b9-b7bd-4805810da349-openstack-config-secret\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752735 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-operator-scripts\") pod \"glance-db-create-b8nfw\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") " pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752762 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs77g\" (UniqueName: \"kubernetes.io/projected/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-kube-api-access-bs77g\") pod \"glance-db-create-b8nfw\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") " pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752804 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbgs\" (UniqueName: \"kubernetes.io/projected/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-kube-api-access-cqbgs\") pod \"glance-bf79-account-create-update-whmk8\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") " pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.752863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ch7w\" (UniqueName: \"kubernetes.io/projected/d360673b-7556-44b9-b7bd-4805810da349-kube-api-access-8ch7w\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.753535 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-config\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.753698 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-operator-scripts\") pod \"glance-db-create-b8nfw\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") " pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.753749 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-operator-scripts\") pod \"glance-bf79-account-create-update-whmk8\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") " pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.754027 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-scripts\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.759988 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d360673b-7556-44b9-b7bd-4805810da349-openstack-config-secret\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.769908 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ch7w\" (UniqueName: \"kubernetes.io/projected/d360673b-7556-44b9-b7bd-4805810da349-kube-api-access-8ch7w\") pod \"openstackclient\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.771439 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbgs\" (UniqueName: \"kubernetes.io/projected/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-kube-api-access-cqbgs\") pod \"glance-bf79-account-create-update-whmk8\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") " pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.771855 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs77g\" (UniqueName: \"kubernetes.io/projected/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-kube-api-access-bs77g\") pod \"glance-db-create-b8nfw\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") " pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.828370 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.874960 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:20 crc kubenswrapper[4751]: I0131 14:59:20.887965 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.232254 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.357241 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-bf79-account-create-update-whmk8"]
Jan 31 14:59:21 crc kubenswrapper[4751]: W0131 14:59:21.365441 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c9ad1c0_9bb7_4d3e_8e68_8310292d89fa.slice/crio-546a3a33848e346d24fbc975e927aab08414e617a2ceb91ba8f794b0d2405aee WatchSource:0}: Error finding container 546a3a33848e346d24fbc975e927aab08414e617a2ceb91ba8f794b0d2405aee: Status 404 returned error can't find the container with id 546a3a33848e346d24fbc975e927aab08414e617a2ceb91ba8f794b0d2405aee
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.386379 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-b8nfw"]
Jan 31 14:59:21 crc kubenswrapper[4751]: W0131 14:59:21.391035 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod892f8632_f7d8_46b0_a39a_4a84f5e3a2aa.slice/crio-67b5a0239378d9b43272b8e5a4f0eb11ba98f694abae6196e55befbe976cb98c WatchSource:0}: Error finding container 67b5a0239378d9b43272b8e5a4f0eb11ba98f694abae6196e55befbe976cb98c: Status 404 returned error can't find the container with id 67b5a0239378d9b43272b8e5a4f0eb11ba98f694abae6196e55befbe976cb98c
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.677259 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8" event={"ID":"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa","Type":"ContainerStarted","Data":"4157453b55a598117cf21c7e58fec8625fe386b3472f188272985dac7429ad14"}
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.677600 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8" event={"ID":"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa","Type":"ContainerStarted","Data":"546a3a33848e346d24fbc975e927aab08414e617a2ceb91ba8f794b0d2405aee"}
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.680285 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b8nfw" event={"ID":"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa","Type":"ContainerStarted","Data":"53e4421364bd50f8121a14bb4c3e20cbc7c5ba08c19bdb68ee47f37ac2b94308"}
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.680319 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b8nfw" event={"ID":"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa","Type":"ContainerStarted","Data":"67b5a0239378d9b43272b8e5a4f0eb11ba98f694abae6196e55befbe976cb98c"}
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.684342 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d360673b-7556-44b9-b7bd-4805810da349","Type":"ContainerStarted","Data":"c2e3697d65b3597868569dcd055006b2a37c4a1b745665e4039d129867477c4a"}
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.713366 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8" podStartSLOduration=1.713347022 podStartE2EDuration="1.713347022s" podCreationTimestamp="2026-01-31 14:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:59:21.695301206 +0000 UTC m=+1066.070014091" watchObservedRunningTime="2026-01-31 14:59:21.713347022 +0000 UTC m=+1066.088059907"
Jan 31 14:59:21 crc kubenswrapper[4751]: I0131 14:59:21.715185 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-b8nfw" podStartSLOduration=1.71517907 podStartE2EDuration="1.71517907s" podCreationTimestamp="2026-01-31 14:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 14:59:21.711086082 +0000 UTC m=+1066.085798967" watchObservedRunningTime="2026-01-31 14:59:21.71517907 +0000 UTC m=+1066.089891955"
Jan 31 14:59:22 crc kubenswrapper[4751]: I0131 14:59:22.695432 4751 generic.go:334] "Generic (PLEG): container finished" podID="892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" containerID="53e4421364bd50f8121a14bb4c3e20cbc7c5ba08c19bdb68ee47f37ac2b94308" exitCode=0
Jan 31 14:59:22 crc kubenswrapper[4751]: I0131 14:59:22.695702 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b8nfw" event={"ID":"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa","Type":"ContainerDied","Data":"53e4421364bd50f8121a14bb4c3e20cbc7c5ba08c19bdb68ee47f37ac2b94308"}
Jan 31 14:59:22 crc kubenswrapper[4751]: I0131 14:59:22.699412 4751 generic.go:334] "Generic (PLEG): container finished" podID="4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" containerID="4157453b55a598117cf21c7e58fec8625fe386b3472f188272985dac7429ad14" exitCode=0
Jan 31 14:59:22 crc kubenswrapper[4751]: I0131 14:59:22.699441 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8" event={"ID":"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa","Type":"ContainerDied","Data":"4157453b55a598117cf21c7e58fec8625fe386b3472f188272985dac7429ad14"}
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.150737 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.158694 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.212522 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-operator-scripts\") pod \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") "
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.212706 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqbgs\" (UniqueName: \"kubernetes.io/projected/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-kube-api-access-cqbgs\") pod \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\" (UID: \"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa\") "
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.213416 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" (UID: "4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.213796 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.217956 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-kube-api-access-cqbgs" (OuterVolumeSpecName: "kube-api-access-cqbgs") pod "4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" (UID: "4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa"). InnerVolumeSpecName "kube-api-access-cqbgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.315015 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-operator-scripts\") pod \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") "
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.315328 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs77g\" (UniqueName: \"kubernetes.io/projected/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-kube-api-access-bs77g\") pod \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\" (UID: \"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa\") "
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.315546 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" (UID: "892f8632-f7d8-46b0-a39a-4a84f5e3a2aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.315603 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqbgs\" (UniqueName: \"kubernetes.io/projected/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa-kube-api-access-cqbgs\") on node \"crc\" DevicePath \"\""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.317834 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-kube-api-access-bs77g" (OuterVolumeSpecName: "kube-api-access-bs77g") pod "892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" (UID: "892f8632-f7d8-46b0-a39a-4a84f5e3a2aa"). InnerVolumeSpecName "kube-api-access-bs77g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.416675 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs77g\" (UniqueName: \"kubernetes.io/projected/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-kube-api-access-bs77g\") on node \"crc\" DevicePath \"\""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.416709 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.726784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b8nfw" event={"ID":"892f8632-f7d8-46b0-a39a-4a84f5e3a2aa","Type":"ContainerDied","Data":"67b5a0239378d9b43272b8e5a4f0eb11ba98f694abae6196e55befbe976cb98c"}
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.726840 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67b5a0239378d9b43272b8e5a4f0eb11ba98f694abae6196e55befbe976cb98c"
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.727983 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b8nfw"
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.729173 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8" event={"ID":"4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa","Type":"ContainerDied","Data":"546a3a33848e346d24fbc975e927aab08414e617a2ceb91ba8f794b0d2405aee"}
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.729218 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="546a3a33848e346d24fbc975e927aab08414e617a2ceb91ba8f794b0d2405aee"
Jan 31 14:59:24 crc kubenswrapper[4751]: I0131 14:59:24.729278 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bf79-account-create-update-whmk8"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.746090 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-lvpgz"]
Jan 31 14:59:25 crc kubenswrapper[4751]: E0131 14:59:25.746613 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" containerName="mariadb-database-create"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.746627 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" containerName="mariadb-database-create"
Jan 31 14:59:25 crc kubenswrapper[4751]: E0131 14:59:25.746644 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" containerName="mariadb-account-create-update"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.746650 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" containerName="mariadb-account-create-update"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.746778 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" containerName="mariadb-account-create-update"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.746794 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" containerName="mariadb-database-create"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.747581 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: W0131 14:59:25.751358 4751 reflector.go:561] object-"glance-kuttl-tests"/"glance-glance-dockercfg-fgjx2": failed to list *v1.Secret: secrets "glance-glance-dockercfg-fgjx2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object
Jan 31 14:59:25 crc kubenswrapper[4751]: W0131 14:59:25.751373 4751 reflector.go:561] object-"glance-kuttl-tests"/"glance-config-data": failed to list *v1.Secret: secrets "glance-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object
Jan 31 14:59:25 crc kubenswrapper[4751]: E0131 14:59:25.751407 4751 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"glance-glance-dockercfg-fgjx2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"glance-glance-dockercfg-fgjx2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 31 14:59:25 crc kubenswrapper[4751]: E0131 14:59:25.751426 4751 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"glance-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"glance-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.761400 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lvpgz"]
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.849514 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvkld\" (UniqueName: \"kubernetes.io/projected/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-kube-api-access-bvkld\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.849620 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-config-data\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.849719 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-db-sync-config-data\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.951590 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-db-sync-config-data\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.951730 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvkld\" (UniqueName: \"kubernetes.io/projected/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-kube-api-access-bvkld\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.952158 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-config-data\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:25 crc kubenswrapper[4751]: I0131 14:59:25.971967 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvkld\" (UniqueName: \"kubernetes.io/projected/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-kube-api-access-bvkld\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:26 crc kubenswrapper[4751]: I0131 14:59:26.782869 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Jan 31 14:59:26 crc kubenswrapper[4751]: I0131 14:59:26.787558 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-db-sync-config-data\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:26 crc kubenswrapper[4751]: I0131 14:59:26.797881 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-config-data\") pod \"glance-db-sync-lvpgz\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") " pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:27 crc kubenswrapper[4751]: I0131 14:59:27.067329 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-fgjx2"
Jan 31 14:59:27 crc kubenswrapper[4751]: I0131 14:59:27.077143 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:30 crc kubenswrapper[4751]: I0131 14:59:30.630467 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lvpgz"]
Jan 31 14:59:30 crc kubenswrapper[4751]: I0131 14:59:30.775239 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d360673b-7556-44b9-b7bd-4805810da349","Type":"ContainerStarted","Data":"44aed1a495029b4f70f43e4d99769c88600905aa98e64ff530f4a6570f61d3ae"}
Jan 31 14:59:30 crc kubenswrapper[4751]: I0131 14:59:30.777736 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lvpgz" event={"ID":"5c7b87c6-2803-4ae5-9257-1a7e12d26f61","Type":"ContainerStarted","Data":"3231b5b68a66beba965e16caaa4e761a73c20400a40b39c6426b49ba73bc4dac"}
Jan 31 14:59:30 crc kubenswrapper[4751]: I0131 14:59:30.792199 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.6500771429999999 podStartE2EDuration="10.792181933s" podCreationTimestamp="2026-01-31 14:59:20 +0000 UTC" firstStartedPulling="2026-01-31 14:59:21.245181359 +0000 UTC m=+1065.619894244" lastFinishedPulling="2026-01-31 14:59:30.387286149 +0000 UTC m=+1074.761999034" observedRunningTime="2026-01-31 14:59:30.790977641 +0000 UTC m=+1075.165690526" watchObservedRunningTime="2026-01-31 14:59:30.792181933 +0000 UTC m=+1075.166894828"
Jan 31 14:59:42 crc kubenswrapper[4751]: I0131 14:59:42.868562 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lvpgz" event={"ID":"5c7b87c6-2803-4ae5-9257-1a7e12d26f61","Type":"ContainerStarted","Data":"4d83615719bc342a748610c14a746dcc08356aa04afe079d5f96b964e25ed0f6"}
Jan 31 14:59:42 crc kubenswrapper[4751]: I0131 14:59:42.892387 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-lvpgz" podStartSLOduration=6.627121364 podStartE2EDuration="17.892369155s" podCreationTimestamp="2026-01-31 14:59:25 +0000 UTC" firstStartedPulling="2026-01-31 14:59:30.641059645 +0000 UTC m=+1075.015772530" lastFinishedPulling="2026-01-31 14:59:41.906307406 +0000 UTC m=+1086.281020321" observedRunningTime="2026-01-31 14:59:42.88383621 +0000 UTC m=+1087.258549095" watchObservedRunningTime="2026-01-31 14:59:42.892369155 +0000 UTC m=+1087.267082040"
Jan 31 14:59:49 crc kubenswrapper[4751]: I0131 14:59:49.931239 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c7b87c6-2803-4ae5-9257-1a7e12d26f61" containerID="4d83615719bc342a748610c14a746dcc08356aa04afe079d5f96b964e25ed0f6" exitCode=0
Jan 31 14:59:49 crc kubenswrapper[4751]: I0131 14:59:49.931355 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lvpgz" event={"ID":"5c7b87c6-2803-4ae5-9257-1a7e12d26f61","Type":"ContainerDied","Data":"4d83615719bc342a748610c14a746dcc08356aa04afe079d5f96b964e25ed0f6"}
Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.235427 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lvpgz"
Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.357229 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-config-data\") pod \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") "
Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.357334 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-db-sync-config-data\") pod \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") "
Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.357370 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvkld\" (UniqueName: \"kubernetes.io/projected/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-kube-api-access-bvkld\") pod \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\" (UID: \"5c7b87c6-2803-4ae5-9257-1a7e12d26f61\") "
Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.363562 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-kube-api-access-bvkld" (OuterVolumeSpecName: "kube-api-access-bvkld") pod "5c7b87c6-2803-4ae5-9257-1a7e12d26f61" (UID: "5c7b87c6-2803-4ae5-9257-1a7e12d26f61"). InnerVolumeSpecName "kube-api-access-bvkld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.364051 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5c7b87c6-2803-4ae5-9257-1a7e12d26f61" (UID: "5c7b87c6-2803-4ae5-9257-1a7e12d26f61").
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.404538 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-config-data" (OuterVolumeSpecName: "config-data") pod "5c7b87c6-2803-4ae5-9257-1a7e12d26f61" (UID: "5c7b87c6-2803-4ae5-9257-1a7e12d26f61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.459881 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.459943 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.459967 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvkld\" (UniqueName: \"kubernetes.io/projected/5c7b87c6-2803-4ae5-9257-1a7e12d26f61-kube-api-access-bvkld\") on node \"crc\" DevicePath \"\"" Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.947347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-lvpgz" event={"ID":"5c7b87c6-2803-4ae5-9257-1a7e12d26f61","Type":"ContainerDied","Data":"3231b5b68a66beba965e16caaa4e761a73c20400a40b39c6426b49ba73bc4dac"} Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.947664 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3231b5b68a66beba965e16caaa4e761a73c20400a40b39c6426b49ba73bc4dac" Jan 31 14:59:51 crc kubenswrapper[4751]: I0131 14:59:51.947434 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-lvpgz" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.155958 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 14:59:53 crc kubenswrapper[4751]: E0131 14:59:53.156296 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7b87c6-2803-4ae5-9257-1a7e12d26f61" containerName="glance-db-sync" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.156313 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7b87c6-2803-4ae5-9257-1a7e12d26f61" containerName="glance-db-sync" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.156473 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7b87c6-2803-4ae5-9257-1a7e12d26f61" containerName="glance-db-sync" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.157182 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.158853 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-fgjx2" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.159637 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.161529 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.184154 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286251 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-lib-modules\") 
pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286503 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286530 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-dev\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286586 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-scripts\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286607 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286620 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-sys\") pod \"glance-default-single-1\" (UID: 
\"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286634 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-logs\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2cjr\" (UniqueName: \"kubernetes.io/projected/53e80c85-256f-4e3a-8338-091b69c8a111-kube-api-access-p2cjr\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286676 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286714 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-run\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286736 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-httpd-run\") pod \"glance-default-single-1\" (UID: 
\"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286753 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-nvme\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286770 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.286788 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-config-data\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.305864 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.306966 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.324501 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388429 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-config-data\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388488 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388551 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-logs\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388576 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-lib-modules\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388618 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-sys\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388638 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-run\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388662 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-run\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388692 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-run\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388704 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388751 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-httpd-run\") pod \"glance-default-single-1\" (UID: 
\"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388775 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-nvme\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388801 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-config-data\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388842 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388934 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-nvme\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " 
pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.388847 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t65v9\" (UniqueName: \"kubernetes.io/projected/ac0f9efc-607e-4d26-8677-3cfdbcae5644-kube-api-access-t65v9\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389208 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-httpd-run\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389233 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389300 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-lib-modules\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389354 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 
crc kubenswrapper[4751]: I0131 14:59:53.389396 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-dev\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389426 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-dev\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389424 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389395 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-lib-modules\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389509 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-dev\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389529 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389552 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-scripts\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389579 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-httpd-run\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389625 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-nvme\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389650 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-scripts\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389672 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389694 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389715 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389734 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-sys\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389756 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-logs\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389789 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2cjr\" (UniqueName: \"kubernetes.io/projected/53e80c85-256f-4e3a-8338-091b69c8a111-kube-api-access-p2cjr\") pod \"glance-default-single-1\" (UID: 
\"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.389856 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.390065 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-sys\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.390402 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-logs\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.404305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-scripts\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.405988 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-config-data\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 
14:59:53.410543 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.412541 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2cjr\" (UniqueName: \"kubernetes.io/projected/53e80c85-256f-4e3a-8338-091b69c8a111-kube-api-access-p2cjr\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.412736 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-1\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.481777 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.490913 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.490974 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t65v9\" (UniqueName: \"kubernetes.io/projected/ac0f9efc-607e-4d26-8677-3cfdbcae5644-kube-api-access-t65v9\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491009 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-dev\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491054 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491061 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: 
I0131 14:59:53.491097 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-scripts\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491138 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-httpd-run\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491141 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-dev\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491176 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-nvme\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491209 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491232 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491274 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-config-data\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491294 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-nvme\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491308 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-logs\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491323 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491333 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-var-locks-brick\") pod 
\"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491336 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-lib-modules\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491370 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-sys\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491394 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-run\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491408 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491452 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-sys\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " 
pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491476 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-run\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491868 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-logs\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491876 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-httpd-run\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.491997 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-lib-modules\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.497424 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-config-data\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.498496 4751 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-scripts\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.512423 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.516415 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t65v9\" (UniqueName: \"kubernetes.io/projected/ac0f9efc-607e-4d26-8677-3cfdbcae5644-kube-api-access-t65v9\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.519611 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.621520 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.824264 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 14:59:53 crc kubenswrapper[4751]: W0131 14:59:53.827143 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53e80c85_256f_4e3a_8338_091b69c8a111.slice/crio-d0fc51af73d94f86e3cd1b0621a38ca7cd14201bdbba30a0fccb4019efc30e6f WatchSource:0}: Error finding container d0fc51af73d94f86e3cd1b0621a38ca7cd14201bdbba30a0fccb4019efc30e6f: Status 404 returned error can't find the container with id d0fc51af73d94f86e3cd1b0621a38ca7cd14201bdbba30a0fccb4019efc30e6f Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.928538 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.969371 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ac0f9efc-607e-4d26-8677-3cfdbcae5644","Type":"ContainerStarted","Data":"ed8251e3c7704b6272035a19ef7b83f396abd3798eb2dc76182eb938556d6f1b"} Jan 31 14:59:53 crc kubenswrapper[4751]: I0131 14:59:53.970633 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"53e80c85-256f-4e3a-8338-091b69c8a111","Type":"ContainerStarted","Data":"d0fc51af73d94f86e3cd1b0621a38ca7cd14201bdbba30a0fccb4019efc30e6f"} Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.030800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"53e80c85-256f-4e3a-8338-091b69c8a111","Type":"ContainerStarted","Data":"2470407eb06da53e051c9bcfd402a9b5b782f16d58c74eaa361abad6c79fcccd"} Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.162807 4751 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld"] Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.163915 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.196532 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh"] Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.197635 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.204411 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh"] Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.205359 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.207025 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.207195 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.208034 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfv7v\" (UniqueName: \"kubernetes.io/projected/4359ffb3-e292-485f-b762-e131f9a9e869-kube-api-access-gfv7v\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.208099 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.208183 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4359ffb3-e292-485f-b762-e131f9a9e869-image-cache-config-data\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.225827 4751 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh"] Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.234405 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.273402 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld"] Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.297940 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh"] Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309654 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mdt5\" (UniqueName: \"kubernetes.io/projected/d5db9258-7fae-47d2-acf9-c523d3d87193-kube-api-access-7mdt5\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309704 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309737 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d5db9258-7fae-47d2-acf9-c523d3d87193-image-cache-config-data\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309780 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfv7v\" (UniqueName: \"kubernetes.io/projected/4359ffb3-e292-485f-b762-e131f9a9e869-kube-api-access-gfv7v\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309809 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp59q\" (UniqueName: \"kubernetes.io/projected/c304f066-d32f-4ebc-af80-09f3680a14cd-kube-api-access-zp59q\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309838 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c304f066-d32f-4ebc-af80-09f3680a14cd-config-volume\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309916 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: 
\"kubernetes.io/secret/4359ffb3-e292-485f-b762-e131f9a9e869-image-cache-config-data\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.309947 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c304f066-d32f-4ebc-af80-09f3680a14cd-secret-volume\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.314793 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4359ffb3-e292-485f-b762-e131f9a9e869-image-cache-config-data\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.323734 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfv7v\" (UniqueName: \"kubernetes.io/projected/4359ffb3-e292-485f-b762-e131f9a9e869-kube-api-access-gfv7v\") pod \"glance-cache-glance-default-single-1-cleaner-29497860-zg4ld\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.326391 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") 
" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.411275 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c304f066-d32f-4ebc-af80-09f3680a14cd-secret-volume\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.411329 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mdt5\" (UniqueName: \"kubernetes.io/projected/d5db9258-7fae-47d2-acf9-c523d3d87193-kube-api-access-7mdt5\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.411378 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d5db9258-7fae-47d2-acf9-c523d3d87193-image-cache-config-data\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.411413 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp59q\" (UniqueName: \"kubernetes.io/projected/c304f066-d32f-4ebc-af80-09f3680a14cd-kube-api-access-zp59q\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.411443 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c304f066-d32f-4ebc-af80-09f3680a14cd-config-volume\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.414477 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c304f066-d32f-4ebc-af80-09f3680a14cd-config-volume\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.418704 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d5db9258-7fae-47d2-acf9-c523d3d87193-image-cache-config-data\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.419629 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c304f066-d32f-4ebc-af80-09f3680a14cd-secret-volume\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.436253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp59q\" (UniqueName: \"kubernetes.io/projected/c304f066-d32f-4ebc-af80-09f3680a14cd-kube-api-access-zp59q\") pod \"collect-profiles-29497860-xjmxh\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.436507 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mdt5\" (UniqueName: \"kubernetes.io/projected/d5db9258-7fae-47d2-acf9-c523d3d87193-kube-api-access-7mdt5\") pod \"glance-cache-glance-default-single-0-cleaner-29497860-lwlxh\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.512955 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.584735 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:00 crc kubenswrapper[4751]: I0131 15:00:00.590424 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:01 crc kubenswrapper[4751]: I0131 15:00:00.923355 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh"] Jan 31 15:00:01 crc kubenswrapper[4751]: W0131 15:00:00.967747 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4359ffb3_e292_485f_b762_e131f9a9e869.slice/crio-44f28ab8d2a42d4dde7d20dd573c7f68bed70bab0f7831c586028f44bc7d7976 WatchSource:0}: Error finding container 44f28ab8d2a42d4dde7d20dd573c7f68bed70bab0f7831c586028f44bc7d7976: Status 404 returned error can't find the container with id 44f28ab8d2a42d4dde7d20dd573c7f68bed70bab0f7831c586028f44bc7d7976 Jan 31 15:00:01 crc kubenswrapper[4751]: I0131 15:00:00.968371 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld"] Jan 31 15:00:01 crc kubenswrapper[4751]: I0131 15:00:01.043480 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"53e80c85-256f-4e3a-8338-091b69c8a111","Type":"ContainerStarted","Data":"52fb0acaeff7876fc2ee5ab2cce699867c40acf2d6cd815e7e538721bdb941cf"} Jan 31 15:00:01 crc kubenswrapper[4751]: I0131 15:00:01.046630 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" event={"ID":"c304f066-d32f-4ebc-af80-09f3680a14cd","Type":"ContainerStarted","Data":"a126fb63db8c50b9a53697e61efa57746e16f049f13c9c76511062853100b24e"} Jan 31 15:00:01 crc kubenswrapper[4751]: I0131 15:00:01.054921 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" 
event={"ID":"4359ffb3-e292-485f-b762-e131f9a9e869","Type":"ContainerStarted","Data":"44f28ab8d2a42d4dde7d20dd573c7f68bed70bab0f7831c586028f44bc7d7976"} Jan 31 15:00:01 crc kubenswrapper[4751]: I0131 15:00:01.055105 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh"] Jan 31 15:00:01 crc kubenswrapper[4751]: W0131 15:00:01.062571 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5db9258_7fae_47d2_acf9_c523d3d87193.slice/crio-a8ecc46e6208031fb6034bc65dbf5c3f1388055248427b8ae008f3d6bbbb7ead WatchSource:0}: Error finding container a8ecc46e6208031fb6034bc65dbf5c3f1388055248427b8ae008f3d6bbbb7ead: Status 404 returned error can't find the container with id a8ecc46e6208031fb6034bc65dbf5c3f1388055248427b8ae008f3d6bbbb7ead Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.063151 4751 generic.go:334] "Generic (PLEG): container finished" podID="c304f066-d32f-4ebc-af80-09f3680a14cd" containerID="5407593bbe6a5401ebbef23950b4e278ee81abe2ac79b5eccd91e19538dc1615" exitCode=0 Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.063220 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" event={"ID":"c304f066-d32f-4ebc-af80-09f3680a14cd","Type":"ContainerDied","Data":"5407593bbe6a5401ebbef23950b4e278ee81abe2ac79b5eccd91e19538dc1615"} Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.065048 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" event={"ID":"d5db9258-7fae-47d2-acf9-c523d3d87193","Type":"ContainerStarted","Data":"522600cf4dfb7197c49e6a2fb7abef1d560bd673fb2da9388c38a54462595db0"} Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.065086 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" event={"ID":"d5db9258-7fae-47d2-acf9-c523d3d87193","Type":"ContainerStarted","Data":"a8ecc46e6208031fb6034bc65dbf5c3f1388055248427b8ae008f3d6bbbb7ead"} Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.066917 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ac0f9efc-607e-4d26-8677-3cfdbcae5644","Type":"ContainerStarted","Data":"97cea10f80c97e4fa02eebedc4adec6206a4c26e20c548ca8591a97a3b1570e1"} Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.066937 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ac0f9efc-607e-4d26-8677-3cfdbcae5644","Type":"ContainerStarted","Data":"5ae99f624e52366771ec3f54c793000f4df8c5d1267908fa6b69f7cac8418069"} Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.079616 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" event={"ID":"4359ffb3-e292-485f-b762-e131f9a9e869","Type":"ContainerStarted","Data":"fdff4dbce192cc3ad36befaed2781dd7252206ed773249a695ca5b7f5682312b"} Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.127540 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=10.127514936 podStartE2EDuration="10.127514936s" podCreationTimestamp="2026-01-31 14:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:00:02.112939712 +0000 UTC m=+1106.487652597" watchObservedRunningTime="2026-01-31 15:00:02.127514936 +0000 UTC m=+1106.502227821" Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.161470 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" 
podStartSLOduration=9.161451402 podStartE2EDuration="9.161451402s" podCreationTimestamp="2026-01-31 14:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:00:02.145220803 +0000 UTC m=+1106.519933688" watchObservedRunningTime="2026-01-31 15:00:02.161451402 +0000 UTC m=+1106.536164277" Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.167877 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" podStartSLOduration=2.167859101 podStartE2EDuration="2.167859101s" podCreationTimestamp="2026-01-31 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:00:02.160910527 +0000 UTC m=+1106.535623432" watchObservedRunningTime="2026-01-31 15:00:02.167859101 +0000 UTC m=+1106.542571986" Jan 31 15:00:02 crc kubenswrapper[4751]: I0131 15:00:02.182164 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" podStartSLOduration=2.182146298 podStartE2EDuration="2.182146298s" podCreationTimestamp="2026-01-31 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:00:02.18071767 +0000 UTC m=+1106.555430575" watchObservedRunningTime="2026-01-31 15:00:02.182146298 +0000 UTC m=+1106.556859183" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.089583 4751 generic.go:334] "Generic (PLEG): container finished" podID="4359ffb3-e292-485f-b762-e131f9a9e869" containerID="fdff4dbce192cc3ad36befaed2781dd7252206ed773249a695ca5b7f5682312b" exitCode=0 Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.089660 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" event={"ID":"4359ffb3-e292-485f-b762-e131f9a9e869","Type":"ContainerDied","Data":"fdff4dbce192cc3ad36befaed2781dd7252206ed773249a695ca5b7f5682312b"} Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.092450 4751 generic.go:334] "Generic (PLEG): container finished" podID="d5db9258-7fae-47d2-acf9-c523d3d87193" containerID="522600cf4dfb7197c49e6a2fb7abef1d560bd673fb2da9388c38a54462595db0" exitCode=0 Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.092501 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" event={"ID":"d5db9258-7fae-47d2-acf9-c523d3d87193","Type":"ContainerDied","Data":"522600cf4dfb7197c49e6a2fb7abef1d560bd673fb2da9388c38a54462595db0"} Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.445215 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.482291 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.482341 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.515718 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.528318 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.568770 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c304f066-d32f-4ebc-af80-09f3680a14cd-secret-volume\") pod \"c304f066-d32f-4ebc-af80-09f3680a14cd\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.568854 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp59q\" (UniqueName: \"kubernetes.io/projected/c304f066-d32f-4ebc-af80-09f3680a14cd-kube-api-access-zp59q\") pod \"c304f066-d32f-4ebc-af80-09f3680a14cd\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.568960 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c304f066-d32f-4ebc-af80-09f3680a14cd-config-volume\") pod \"c304f066-d32f-4ebc-af80-09f3680a14cd\" (UID: \"c304f066-d32f-4ebc-af80-09f3680a14cd\") " Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.569705 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c304f066-d32f-4ebc-af80-09f3680a14cd-config-volume" (OuterVolumeSpecName: "config-volume") pod "c304f066-d32f-4ebc-af80-09f3680a14cd" (UID: "c304f066-d32f-4ebc-af80-09f3680a14cd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.570099 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c304f066-d32f-4ebc-af80-09f3680a14cd-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.574170 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c304f066-d32f-4ebc-af80-09f3680a14cd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c304f066-d32f-4ebc-af80-09f3680a14cd" (UID: "c304f066-d32f-4ebc-af80-09f3680a14cd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.574536 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c304f066-d32f-4ebc-af80-09f3680a14cd-kube-api-access-zp59q" (OuterVolumeSpecName: "kube-api-access-zp59q") pod "c304f066-d32f-4ebc-af80-09f3680a14cd" (UID: "c304f066-d32f-4ebc-af80-09f3680a14cd"). InnerVolumeSpecName "kube-api-access-zp59q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.622976 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.623019 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.646996 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.660619 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.671039 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c304f066-d32f-4ebc-af80-09f3680a14cd-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:03 crc kubenswrapper[4751]: I0131 15:00:03.671087 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp59q\" (UniqueName: \"kubernetes.io/projected/c304f066-d32f-4ebc-af80-09f3680a14cd-kube-api-access-zp59q\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.103428 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" event={"ID":"c304f066-d32f-4ebc-af80-09f3680a14cd","Type":"ContainerDied","Data":"a126fb63db8c50b9a53697e61efa57746e16f049f13c9c76511062853100b24e"} Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.103828 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a126fb63db8c50b9a53697e61efa57746e16f049f13c9c76511062853100b24e" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.103782 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497860-xjmxh" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.104282 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.104317 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.104351 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.104363 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.425097 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.485869 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/4359ffb3-e292-485f-b762-e131f9a9e869-image-cache-config-data\") pod \"4359ffb3-e292-485f-b762-e131f9a9e869\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.485927 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfv7v\" (UniqueName: \"kubernetes.io/projected/4359ffb3-e292-485f-b762-e131f9a9e869-kube-api-access-gfv7v\") pod \"4359ffb3-e292-485f-b762-e131f9a9e869\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.486010 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"4359ffb3-e292-485f-b762-e131f9a9e869\" (UID: \"4359ffb3-e292-485f-b762-e131f9a9e869\") " Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.497796 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4359ffb3-e292-485f-b762-e131f9a9e869-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "4359ffb3-e292-485f-b762-e131f9a9e869" (UID: "4359ffb3-e292-485f-b762-e131f9a9e869"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.499606 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "4359ffb3-e292-485f-b762-e131f9a9e869" (UID: "4359ffb3-e292-485f-b762-e131f9a9e869"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.504680 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4359ffb3-e292-485f-b762-e131f9a9e869-kube-api-access-gfv7v" (OuterVolumeSpecName: "kube-api-access-gfv7v") pod "4359ffb3-e292-485f-b762-e131f9a9e869" (UID: "4359ffb3-e292-485f-b762-e131f9a9e869"). InnerVolumeSpecName "kube-api-access-gfv7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.542614 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.588131 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mdt5\" (UniqueName: \"kubernetes.io/projected/d5db9258-7fae-47d2-acf9-c523d3d87193-kube-api-access-7mdt5\") pod \"d5db9258-7fae-47d2-acf9-c523d3d87193\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.588194 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d5db9258-7fae-47d2-acf9-c523d3d87193-image-cache-config-data\") pod \"d5db9258-7fae-47d2-acf9-c523d3d87193\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.588220 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d5db9258-7fae-47d2-acf9-c523d3d87193\" (UID: \"d5db9258-7fae-47d2-acf9-c523d3d87193\") " Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.588628 4751 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: 
\"kubernetes.io/secret/4359ffb3-e292-485f-b762-e131f9a9e869-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.588642 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfv7v\" (UniqueName: \"kubernetes.io/projected/4359ffb3-e292-485f-b762-e131f9a9e869-kube-api-access-gfv7v\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.590986 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5db9258-7fae-47d2-acf9-c523d3d87193-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "d5db9258-7fae-47d2-acf9-c523d3d87193" (UID: "d5db9258-7fae-47d2-acf9-c523d3d87193"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.592339 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "d5db9258-7fae-47d2-acf9-c523d3d87193" (UID: "d5db9258-7fae-47d2-acf9-c523d3d87193"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.592392 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5db9258-7fae-47d2-acf9-c523d3d87193-kube-api-access-7mdt5" (OuterVolumeSpecName: "kube-api-access-7mdt5") pod "d5db9258-7fae-47d2-acf9-c523d3d87193" (UID: "d5db9258-7fae-47d2-acf9-c523d3d87193"). InnerVolumeSpecName "kube-api-access-7mdt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.690466 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mdt5\" (UniqueName: \"kubernetes.io/projected/d5db9258-7fae-47d2-acf9-c523d3d87193-kube-api-access-7mdt5\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:04 crc kubenswrapper[4751]: I0131 15:00:04.690514 4751 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d5db9258-7fae-47d2-acf9-c523d3d87193-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:05 crc kubenswrapper[4751]: I0131 15:00:05.111370 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" Jan 31 15:00:05 crc kubenswrapper[4751]: I0131 15:00:05.111413 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld" event={"ID":"4359ffb3-e292-485f-b762-e131f9a9e869","Type":"ContainerDied","Data":"44f28ab8d2a42d4dde7d20dd573c7f68bed70bab0f7831c586028f44bc7d7976"} Jan 31 15:00:05 crc kubenswrapper[4751]: I0131 15:00:05.111456 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44f28ab8d2a42d4dde7d20dd573c7f68bed70bab0f7831c586028f44bc7d7976" Jan 31 15:00:05 crc kubenswrapper[4751]: I0131 15:00:05.113097 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" event={"ID":"d5db9258-7fae-47d2-acf9-c523d3d87193","Type":"ContainerDied","Data":"a8ecc46e6208031fb6034bc65dbf5c3f1388055248427b8ae008f3d6bbbb7ead"} Jan 31 15:00:05 crc kubenswrapper[4751]: I0131 15:00:05.113155 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ecc46e6208031fb6034bc65dbf5c3f1388055248427b8ae008f3d6bbbb7ead" Jan 31 
15:00:05 crc kubenswrapper[4751]: I0131 15:00:05.113159 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh" Jan 31 15:00:06 crc kubenswrapper[4751]: I0131 15:00:06.135922 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:07 crc kubenswrapper[4751]: I0131 15:00:07.049461 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:00:07 crc kubenswrapper[4751]: I0131 15:00:07.121021 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:00:07 crc kubenswrapper[4751]: I0131 15:00:07.121252 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-log" containerID="cri-o://5ae99f624e52366771ec3f54c793000f4df8c5d1267908fa6b69f7cac8418069" gracePeriod=30 Jan 31 15:00:07 crc kubenswrapper[4751]: I0131 15:00:07.121894 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-httpd" containerID="cri-o://97cea10f80c97e4fa02eebedc4adec6206a4c26e20c548ca8591a97a3b1570e1" gracePeriod=30 Jan 31 15:00:07 crc kubenswrapper[4751]: I0131 15:00:07.132939 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.100:9292/healthcheck\": EOF" Jan 31 15:00:07 crc kubenswrapper[4751]: I0131 15:00:07.139349 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" 
podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.100:9292/healthcheck\": EOF" Jan 31 15:00:08 crc kubenswrapper[4751]: I0131 15:00:08.149154 4751 generic.go:334] "Generic (PLEG): container finished" podID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerID="5ae99f624e52366771ec3f54c793000f4df8c5d1267908fa6b69f7cac8418069" exitCode=143 Jan 31 15:00:08 crc kubenswrapper[4751]: I0131 15:00:08.149227 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ac0f9efc-607e-4d26-8677-3cfdbcae5644","Type":"ContainerDied","Data":"5ae99f624e52366771ec3f54c793000f4df8c5d1267908fa6b69f7cac8418069"} Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.216630 4751 generic.go:334] "Generic (PLEG): container finished" podID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerID="97cea10f80c97e4fa02eebedc4adec6206a4c26e20c548ca8591a97a3b1570e1" exitCode=0 Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.216809 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ac0f9efc-607e-4d26-8677-3cfdbcae5644","Type":"ContainerDied","Data":"97cea10f80c97e4fa02eebedc4adec6206a4c26e20c548ca8591a97a3b1570e1"} Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.693333 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.826735 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-config-data\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.826821 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-sys\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.826861 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-dev\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.826907 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-run\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827011 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-httpd-run\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827053 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827162 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t65v9\" (UniqueName: \"kubernetes.io/projected/ac0f9efc-607e-4d26-8677-3cfdbcae5644-kube-api-access-t65v9\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827199 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-dev" (OuterVolumeSpecName: "dev") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827243 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-scripts\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827296 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-nvme\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827400 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-logs\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827469 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827533 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-lib-modules\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827583 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-iscsi\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827627 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-var-locks-brick\") pod \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\" (UID: \"ac0f9efc-607e-4d26-8677-3cfdbcae5644\") " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827865 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-sys" (OuterVolumeSpecName: "sys") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.827891 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-run" (OuterVolumeSpecName: "run") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828083 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828160 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828155 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828193 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828210 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828222 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828231 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828225 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828401 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-logs" (OuterVolumeSpecName: "logs") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.828663 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.834260 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.834780 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0f9efc-607e-4d26-8677-3cfdbcae5644-kube-api-access-t65v9" (OuterVolumeSpecName: "kube-api-access-t65v9") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "kube-api-access-t65v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.838185 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.842954 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-scripts" (OuterVolumeSpecName: "scripts") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.882898 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-config-data" (OuterVolumeSpecName: "config-data") pod "ac0f9efc-607e-4d26-8677-3cfdbcae5644" (UID: "ac0f9efc-607e-4d26-8677-3cfdbcae5644"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930182 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930245 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930263 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930275 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930290 4751 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930304 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac0f9efc-607e-4d26-8677-3cfdbcae5644-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930330 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930343 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t65v9\" (UniqueName: \"kubernetes.io/projected/ac0f9efc-607e-4d26-8677-3cfdbcae5644-kube-api-access-t65v9\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930356 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0f9efc-607e-4d26-8677-3cfdbcae5644-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.930369 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ac0f9efc-607e-4d26-8677-3cfdbcae5644-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.943296 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 15:00:14 crc kubenswrapper[4751]: I0131 15:00:14.949049 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 
15:00:15.043245 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.043299 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.226484 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ac0f9efc-607e-4d26-8677-3cfdbcae5644","Type":"ContainerDied","Data":"ed8251e3c7704b6272035a19ef7b83f396abd3798eb2dc76182eb938556d6f1b"} Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.226539 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.226568 4751 scope.go:117] "RemoveContainer" containerID="97cea10f80c97e4fa02eebedc4adec6206a4c26e20c548ca8591a97a3b1570e1" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.249143 4751 scope.go:117] "RemoveContainer" containerID="5ae99f624e52366771ec3f54c793000f4df8c5d1267908fa6b69f7cac8418069" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.262495 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.270110 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295184 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:00:15 crc kubenswrapper[4751]: E0131 15:00:15.295623 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c304f066-d32f-4ebc-af80-09f3680a14cd" 
containerName="collect-profiles" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295647 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="c304f066-d32f-4ebc-af80-09f3680a14cd" containerName="collect-profiles" Jan 31 15:00:15 crc kubenswrapper[4751]: E0131 15:00:15.295660 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4359ffb3-e292-485f-b762-e131f9a9e869" containerName="glance-cache-glance-default-single-1-cleaner" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295672 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4359ffb3-e292-485f-b762-e131f9a9e869" containerName="glance-cache-glance-default-single-1-cleaner" Jan 31 15:00:15 crc kubenswrapper[4751]: E0131 15:00:15.295698 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-log" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295709 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-log" Jan 31 15:00:15 crc kubenswrapper[4751]: E0131 15:00:15.295736 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5db9258-7fae-47d2-acf9-c523d3d87193" containerName="glance-cache-glance-default-single-0-cleaner" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295749 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5db9258-7fae-47d2-acf9-c523d3d87193" containerName="glance-cache-glance-default-single-0-cleaner" Jan 31 15:00:15 crc kubenswrapper[4751]: E0131 15:00:15.295767 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-httpd" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295777 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-httpd" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.295975 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d5db9258-7fae-47d2-acf9-c523d3d87193" containerName="glance-cache-glance-default-single-0-cleaner" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.296009 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="c304f066-d32f-4ebc-af80-09f3680a14cd" containerName="collect-profiles" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.296029 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-httpd" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.296050 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4359ffb3-e292-485f-b762-e131f9a9e869" containerName="glance-cache-glance-default-single-1-cleaner" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.296095 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" containerName="glance-log" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.297284 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.304893 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.447793 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448127 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-logs\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448159 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448178 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-dev\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448247 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-sys\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448283 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-config-data\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448312 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-lib-modules\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448388 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-run\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448438 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-scripts\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448469 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc4c\" (UniqueName: 
\"kubernetes.io/projected/a6f236ad-2ab6-4e51-b934-402f28844e69-kube-api-access-wzc4c\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448511 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-httpd-run\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448546 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448613 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.448653 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549664 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-logs\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549707 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549744 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-dev\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549786 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-sys\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-config-data\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549851 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-dev\") pod \"glance-default-single-0\" (UID: 
\"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549864 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549887 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-lib-modules\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549854 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-lib-modules\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549907 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-sys\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.549929 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-run\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: 
I0131 15:00:15.549961 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-run\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550001 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-scripts\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550023 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc4c\" (UniqueName: \"kubernetes.io/projected/a6f236ad-2ab6-4e51-b934-402f28844e69-kube-api-access-wzc4c\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550039 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-httpd-run\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550098 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550143 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550172 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550233 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550392 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550525 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550556 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-iscsi\") pod 
\"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550646 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-httpd-run\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.550679 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.551105 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-logs\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.557559 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-scripts\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.562027 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-config-data\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 
31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.582227 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc4c\" (UniqueName: \"kubernetes.io/projected/a6f236ad-2ab6-4e51-b934-402f28844e69-kube-api-access-wzc4c\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.584001 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.594119 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:15 crc kubenswrapper[4751]: I0131 15:00:15.630171 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:16 crc kubenswrapper[4751]: I0131 15:00:16.060120 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:00:16 crc kubenswrapper[4751]: I0131 15:00:16.235944 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a6f236ad-2ab6-4e51-b934-402f28844e69","Type":"ContainerStarted","Data":"0404afa0dee3bb2591b16ea7fdc6a0ed77a19e078e63d50f945b13286beb2ed9"} Jan 31 15:00:16 crc kubenswrapper[4751]: I0131 15:00:16.415352 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0f9efc-607e-4d26-8677-3cfdbcae5644" path="/var/lib/kubelet/pods/ac0f9efc-607e-4d26-8677-3cfdbcae5644/volumes" Jan 31 15:00:17 crc kubenswrapper[4751]: I0131 15:00:17.245833 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a6f236ad-2ab6-4e51-b934-402f28844e69","Type":"ContainerStarted","Data":"3d4c9658ef799ffc5bf29e9925845adf68e2321560adab813cd297c7b6dfe0e2"} Jan 31 15:00:17 crc kubenswrapper[4751]: I0131 15:00:17.247548 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a6f236ad-2ab6-4e51-b934-402f28844e69","Type":"ContainerStarted","Data":"c1281edefeae3927db375b0c14eca77f9671b71769c05b60b41b1179bc1039fe"} Jan 31 15:00:17 crc kubenswrapper[4751]: I0131 15:00:17.276619 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.276596604 podStartE2EDuration="2.276596604s" podCreationTimestamp="2026-01-31 15:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:00:17.269137637 +0000 UTC m=+1121.643850572" watchObservedRunningTime="2026-01-31 15:00:17.276596604 +0000 
UTC m=+1121.651309499" Jan 31 15:00:25 crc kubenswrapper[4751]: I0131 15:00:25.630643 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:25 crc kubenswrapper[4751]: I0131 15:00:25.631284 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:25 crc kubenswrapper[4751]: I0131 15:00:25.663160 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:25 crc kubenswrapper[4751]: I0131 15:00:25.668685 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:26 crc kubenswrapper[4751]: I0131 15:00:26.317494 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:26 crc kubenswrapper[4751]: I0131 15:00:26.317566 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:28 crc kubenswrapper[4751]: I0131 15:00:28.221164 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:00:28 crc kubenswrapper[4751]: I0131 15:00:28.224626 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.175605 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-cron-29497861-5bd6d"] Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.177579 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.197512 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29497861-5bd6d"] Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.257807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-fernet-keys\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.257906 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-config-data\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.257960 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czpkb\" (UniqueName: \"kubernetes.io/projected/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-kube-api-access-czpkb\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.359020 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-config-data\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.359119 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-czpkb\" (UniqueName: \"kubernetes.io/projected/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-kube-api-access-czpkb\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.359227 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-fernet-keys\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.369037 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-config-data\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.376531 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-fernet-keys\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.380904 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czpkb\" (UniqueName: \"kubernetes.io/projected/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-kube-api-access-czpkb\") pod \"keystone-cron-29497861-5bd6d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:00 crc kubenswrapper[4751]: I0131 15:01:00.514807 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:01 crc kubenswrapper[4751]: I0131 15:01:01.002209 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29497861-5bd6d"] Jan 31 15:01:01 crc kubenswrapper[4751]: W0131 15:01:01.004118 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbce6ceb9_5b0d_4ec7_9492_94dce9bb261d.slice/crio-e746be916cd4d51c51a7b8ba5c98afa99b72d12f05c9b8ae4e86df55efe466c0 WatchSource:0}: Error finding container e746be916cd4d51c51a7b8ba5c98afa99b72d12f05c9b8ae4e86df55efe466c0: Status 404 returned error can't find the container with id e746be916cd4d51c51a7b8ba5c98afa99b72d12f05c9b8ae4e86df55efe466c0 Jan 31 15:01:01 crc kubenswrapper[4751]: I0131 15:01:01.633225 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" event={"ID":"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d","Type":"ContainerStarted","Data":"4fd861ffb49593c05c4ca2dc031ea7913a88e0c31f7cbaf913eca6a5819336ff"} Jan 31 15:01:01 crc kubenswrapper[4751]: I0131 15:01:01.633519 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" event={"ID":"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d","Type":"ContainerStarted","Data":"e746be916cd4d51c51a7b8ba5c98afa99b72d12f05c9b8ae4e86df55efe466c0"} Jan 31 15:01:01 crc kubenswrapper[4751]: I0131 15:01:01.660333 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" podStartSLOduration=1.6603033470000002 podStartE2EDuration="1.660303347s" podCreationTimestamp="2026-01-31 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:01.647935021 +0000 UTC m=+1166.022647906" watchObservedRunningTime="2026-01-31 15:01:01.660303347 
+0000 UTC m=+1166.035016262" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.174149 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lvpgz"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.187109 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-lvpgz"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.195584 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancebf79-account-delete-vglq2"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.196698 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.224121 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancebf79-account-delete-vglq2"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.289364 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtfm\" (UniqueName: \"kubernetes.io/projected/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-kube-api-access-7mtfm\") pod \"glancebf79-account-delete-vglq2\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.289433 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-operator-scripts\") pod \"glancebf79-account-delete-vglq2\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.291271 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.291542 
4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-log" containerID="cri-o://3d4c9658ef799ffc5bf29e9925845adf68e2321560adab813cd297c7b6dfe0e2" gracePeriod=30 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.291915 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-httpd" containerID="cri-o://c1281edefeae3927db375b0c14eca77f9671b71769c05b60b41b1179bc1039fe" gracePeriod=30 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.298595 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.309144 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.317556 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.317871 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-log" containerID="cri-o://2470407eb06da53e051c9bcfd402a9b5b782f16d58c74eaa361abad6c79fcccd" gracePeriod=30 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.317946 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-httpd" containerID="cri-o://52fb0acaeff7876fc2ee5ab2cce699867c40acf2d6cd815e7e538721bdb941cf" gracePeriod=30 Jan 31 15:01:02 crc 
kubenswrapper[4751]: I0131 15:01:02.325132 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-1-cleaner-29497860-zg4ld"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.329396 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-single-0-cleaner-29497860-lwlxh"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.381576 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.381960 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="d360673b-7556-44b9-b7bd-4805810da349" containerName="openstackclient" containerID="cri-o://44aed1a495029b4f70f43e4d99769c88600905aa98e64ff530f4a6570f61d3ae" gracePeriod=30 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.390459 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtfm\" (UniqueName: \"kubernetes.io/projected/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-kube-api-access-7mtfm\") pod \"glancebf79-account-delete-vglq2\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.390499 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-operator-scripts\") pod \"glancebf79-account-delete-vglq2\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.391343 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-operator-scripts\") pod 
\"glancebf79-account-delete-vglq2\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.418601 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4359ffb3-e292-485f-b762-e131f9a9e869" path="/var/lib/kubelet/pods/4359ffb3-e292-485f-b762-e131f9a9e869/volumes" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.419448 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7b87c6-2803-4ae5-9257-1a7e12d26f61" path="/var/lib/kubelet/pods/5c7b87c6-2803-4ae5-9257-1a7e12d26f61/volumes" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.420132 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5db9258-7fae-47d2-acf9-c523d3d87193" path="/var/lib/kubelet/pods/d5db9258-7fae-47d2-acf9-c523d3d87193/volumes" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.421151 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtfm\" (UniqueName: \"kubernetes.io/projected/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-kube-api-access-7mtfm\") pod \"glancebf79-account-delete-vglq2\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.518088 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.656315 4751 generic.go:334] "Generic (PLEG): container finished" podID="d360673b-7556-44b9-b7bd-4805810da349" containerID="44aed1a495029b4f70f43e4d99769c88600905aa98e64ff530f4a6570f61d3ae" exitCode=143 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.656648 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d360673b-7556-44b9-b7bd-4805810da349","Type":"ContainerDied","Data":"44aed1a495029b4f70f43e4d99769c88600905aa98e64ff530f4a6570f61d3ae"} Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.666964 4751 generic.go:334] "Generic (PLEG): container finished" podID="53e80c85-256f-4e3a-8338-091b69c8a111" containerID="2470407eb06da53e051c9bcfd402a9b5b782f16d58c74eaa361abad6c79fcccd" exitCode=143 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.667002 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"53e80c85-256f-4e3a-8338-091b69c8a111","Type":"ContainerDied","Data":"2470407eb06da53e051c9bcfd402a9b5b782f16d58c74eaa361abad6c79fcccd"} Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.672992 4751 generic.go:334] "Generic (PLEG): container finished" podID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerID="3d4c9658ef799ffc5bf29e9925845adf68e2321560adab813cd297c7b6dfe0e2" exitCode=143 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.673054 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a6f236ad-2ab6-4e51-b934-402f28844e69","Type":"ContainerDied","Data":"3d4c9658ef799ffc5bf29e9925845adf68e2321560adab813cd297c7b6dfe0e2"} Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.778620 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancebf79-account-delete-vglq2"] Jan 31 15:01:02 crc 
kubenswrapper[4751]: W0131 15:01:02.788379 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de76201_fcd1_48a2_8bba_dcdf63bbdf20.slice/crio-5bf286bf34a619105d42f4b703f7d5e3780035dd5d176dd0ce6e759bed3be659 WatchSource:0}: Error finding container 5bf286bf34a619105d42f4b703f7d5e3780035dd5d176dd0ce6e759bed3be659: Status 404 returned error can't find the container with id 5bf286bf34a619105d42f4b703f7d5e3780035dd5d176dd0ce6e759bed3be659 Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.814007 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.896082 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-scripts\") pod \"d360673b-7556-44b9-b7bd-4805810da349\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.896404 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ch7w\" (UniqueName: \"kubernetes.io/projected/d360673b-7556-44b9-b7bd-4805810da349-kube-api-access-8ch7w\") pod \"d360673b-7556-44b9-b7bd-4805810da349\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.896437 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-config\") pod \"d360673b-7556-44b9-b7bd-4805810da349\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.896467 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/d360673b-7556-44b9-b7bd-4805810da349-openstack-config-secret\") pod \"d360673b-7556-44b9-b7bd-4805810da349\" (UID: \"d360673b-7556-44b9-b7bd-4805810da349\") " Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.896834 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "d360673b-7556-44b9-b7bd-4805810da349" (UID: "d360673b-7556-44b9-b7bd-4805810da349"). InnerVolumeSpecName "openstack-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.897157 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.926608 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d360673b-7556-44b9-b7bd-4805810da349-kube-api-access-8ch7w" (OuterVolumeSpecName: "kube-api-access-8ch7w") pod "d360673b-7556-44b9-b7bd-4805810da349" (UID: "d360673b-7556-44b9-b7bd-4805810da349"). InnerVolumeSpecName "kube-api-access-8ch7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.927100 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d360673b-7556-44b9-b7bd-4805810da349" (UID: "d360673b-7556-44b9-b7bd-4805810da349"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.938166 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d360673b-7556-44b9-b7bd-4805810da349-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d360673b-7556-44b9-b7bd-4805810da349" (UID: "d360673b-7556-44b9-b7bd-4805810da349"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.999048 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ch7w\" (UniqueName: \"kubernetes.io/projected/d360673b-7556-44b9-b7bd-4805810da349-kube-api-access-8ch7w\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.999097 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d360673b-7556-44b9-b7bd-4805810da349-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:02 crc kubenswrapper[4751]: I0131 15:01:02.999128 4751 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d360673b-7556-44b9-b7bd-4805810da349-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.680384 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.682307 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"d360673b-7556-44b9-b7bd-4805810da349","Type":"ContainerDied","Data":"c2e3697d65b3597868569dcd055006b2a37c4a1b745665e4039d129867477c4a"} Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.682379 4751 scope.go:117] "RemoveContainer" containerID="44aed1a495029b4f70f43e4d99769c88600905aa98e64ff530f4a6570f61d3ae" Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.683902 4751 generic.go:334] "Generic (PLEG): container finished" podID="bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" containerID="4fd861ffb49593c05c4ca2dc031ea7913a88e0c31f7cbaf913eca6a5819336ff" exitCode=0 Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.683960 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" event={"ID":"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d","Type":"ContainerDied","Data":"4fd861ffb49593c05c4ca2dc031ea7913a88e0c31f7cbaf913eca6a5819336ff"} Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.685961 4751 generic.go:334] "Generic (PLEG): container finished" podID="6de76201-fcd1-48a2-8bba-dcdf63bbdf20" containerID="2def03042cdbf5505276d6eb76695378d7a0c3b7b97a2d260b2bb7c00d1d66d9" exitCode=0 Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.686028 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" event={"ID":"6de76201-fcd1-48a2-8bba-dcdf63bbdf20","Type":"ContainerDied","Data":"2def03042cdbf5505276d6eb76695378d7a0c3b7b97a2d260b2bb7c00d1d66d9"} Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.686083 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" 
event={"ID":"6de76201-fcd1-48a2-8bba-dcdf63bbdf20","Type":"ContainerStarted","Data":"5bf286bf34a619105d42f4b703f7d5e3780035dd5d176dd0ce6e759bed3be659"} Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.739368 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 15:01:03 crc kubenswrapper[4751]: I0131 15:01:03.746017 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 15:01:04 crc kubenswrapper[4751]: I0131 15:01:04.413926 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d360673b-7556-44b9-b7bd-4805810da349" path="/var/lib/kubelet/pods/d360673b-7556-44b9-b7bd-4805810da349/volumes" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.007060 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.011626 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.134457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-operator-scripts\") pod \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.134596 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtfm\" (UniqueName: \"kubernetes.io/projected/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-kube-api-access-7mtfm\") pod \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\" (UID: \"6de76201-fcd1-48a2-8bba-dcdf63bbdf20\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.134645 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czpkb\" (UniqueName: \"kubernetes.io/projected/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-kube-api-access-czpkb\") pod \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.134717 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-config-data\") pod \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.134761 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-fernet-keys\") pod \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\" (UID: \"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.135254 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6de76201-fcd1-48a2-8bba-dcdf63bbdf20" (UID: "6de76201-fcd1-48a2-8bba-dcdf63bbdf20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.139852 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-kube-api-access-czpkb" (OuterVolumeSpecName: "kube-api-access-czpkb") pod "bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" (UID: "bce6ceb9-5b0d-4ec7-9492-94dce9bb261d"). InnerVolumeSpecName "kube-api-access-czpkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.140145 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-kube-api-access-7mtfm" (OuterVolumeSpecName: "kube-api-access-7mtfm") pod "6de76201-fcd1-48a2-8bba-dcdf63bbdf20" (UID: "6de76201-fcd1-48a2-8bba-dcdf63bbdf20"). InnerVolumeSpecName "kube-api-access-7mtfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.151226 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" (UID: "bce6ceb9-5b0d-4ec7-9492-94dce9bb261d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.180109 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-config-data" (OuterVolumeSpecName: "config-data") pod "bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" (UID: "bce6ceb9-5b0d-4ec7-9492-94dce9bb261d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.236729 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mtfm\" (UniqueName: \"kubernetes.io/projected/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-kube-api-access-7mtfm\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.236769 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czpkb\" (UniqueName: \"kubernetes.io/projected/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-kube-api-access-czpkb\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.236783 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.236797 4751 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.236811 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6de76201-fcd1-48a2-8bba-dcdf63bbdf20-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.709843 4751 generic.go:334] "Generic (PLEG): container finished" podID="53e80c85-256f-4e3a-8338-091b69c8a111" 
containerID="52fb0acaeff7876fc2ee5ab2cce699867c40acf2d6cd815e7e538721bdb941cf" exitCode=0 Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.710159 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"53e80c85-256f-4e3a-8338-091b69c8a111","Type":"ContainerDied","Data":"52fb0acaeff7876fc2ee5ab2cce699867c40acf2d6cd815e7e538721bdb941cf"} Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.714728 4751 generic.go:334] "Generic (PLEG): container finished" podID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerID="c1281edefeae3927db375b0c14eca77f9671b71769c05b60b41b1179bc1039fe" exitCode=0 Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.714787 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a6f236ad-2ab6-4e51-b934-402f28844e69","Type":"ContainerDied","Data":"c1281edefeae3927db375b0c14eca77f9671b71769c05b60b41b1179bc1039fe"} Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.717176 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" event={"ID":"6de76201-fcd1-48a2-8bba-dcdf63bbdf20","Type":"ContainerDied","Data":"5bf286bf34a619105d42f4b703f7d5e3780035dd5d176dd0ce6e759bed3be659"} Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.717202 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf286bf34a619105d42f4b703f7d5e3780035dd5d176dd0ce6e759bed3be659" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.717278 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancebf79-account-delete-vglq2" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.730525 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" event={"ID":"bce6ceb9-5b0d-4ec7-9492-94dce9bb261d","Type":"ContainerDied","Data":"e746be916cd4d51c51a7b8ba5c98afa99b72d12f05c9b8ae4e86df55efe466c0"} Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.730816 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e746be916cd4d51c51a7b8ba5c98afa99b72d12f05c9b8ae4e86df55efe466c0" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.731710 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29497861-5bd6d" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.813806 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.848616 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962083 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-logs\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962132 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962161 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-dev\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962179 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-sys\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962191 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-lib-modules\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962213 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962232 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-httpd-run\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962247 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-dev\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962263 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzc4c\" (UniqueName: \"kubernetes.io/projected/a6f236ad-2ab6-4e51-b934-402f28844e69-kube-api-access-wzc4c\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962258 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-sys" (OuterVolumeSpecName: "sys") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962286 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-config-data\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962315 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-dev" (OuterVolumeSpecName: "dev") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962320 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-lib-modules\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962337 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-var-locks-brick\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962349 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-nvme\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962370 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-nvme\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962399 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-sys\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962412 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962436 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-logs\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962453 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-config-data\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962468 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-iscsi\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962483 4751 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962493 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962500 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-logs" (OuterVolumeSpecName: "logs") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962489 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-var-locks-brick\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962517 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962525 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962550 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962568 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962571 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-run\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962605 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-scripts\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962631 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962640 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-iscsi\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962664 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-dev" (OuterVolumeSpecName: "dev") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962678 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-scripts\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962690 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-run" (OuterVolumeSpecName: "run") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962708 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2cjr\" (UniqueName: \"kubernetes.io/projected/53e80c85-256f-4e3a-8338-091b69c8a111-kube-api-access-p2cjr\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962745 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962784 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-run\") pod \"53e80c85-256f-4e3a-8338-091b69c8a111\" (UID: \"53e80c85-256f-4e3a-8338-091b69c8a111\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962805 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-httpd-run\") pod \"a6f236ad-2ab6-4e51-b934-402f28844e69\" (UID: \"a6f236ad-2ab6-4e51-b934-402f28844e69\") " Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962847 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-sys" (OuterVolumeSpecName: "sys") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962874 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-logs" (OuterVolumeSpecName: "logs") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.962904 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963301 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963314 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963326 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963337 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963347 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963358 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963367 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963377 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963387 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963396 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e80c85-256f-4e3a-8338-091b69c8a111-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963405 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963415 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963426 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.963680 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.966322 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.966254 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-run" (OuterVolumeSpecName: "run") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.967305 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.967905 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f236ad-2ab6-4e51-b934-402f28844e69-kube-api-access-wzc4c" (OuterVolumeSpecName: "kube-api-access-wzc4c") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "kube-api-access-wzc4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.967926 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.968009 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e80c85-256f-4e3a-8338-091b69c8a111-kube-api-access-p2cjr" (OuterVolumeSpecName: "kube-api-access-p2cjr") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "kube-api-access-p2cjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.971143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-scripts" (OuterVolumeSpecName: "scripts") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.971253 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.972142 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-scripts" (OuterVolumeSpecName: "scripts") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:05 crc kubenswrapper[4751]: I0131 15:01:05.972644 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.026609 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-config-data" (OuterVolumeSpecName: "config-data") pod "a6f236ad-2ab6-4e51-b934-402f28844e69" (UID: "a6f236ad-2ab6-4e51-b934-402f28844e69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.045055 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-config-data" (OuterVolumeSpecName: "config-data") pod "53e80c85-256f-4e3a-8338-091b69c8a111" (UID: "53e80c85-256f-4e3a-8338-091b69c8a111"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.064916 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.064942 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.064957 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzc4c\" (UniqueName: \"kubernetes.io/projected/a6f236ad-2ab6-4e51-b934-402f28844e69-kube-api-access-wzc4c\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.064973 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.064985 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065001 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065012 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065025 4751 reconciler_common.go:293] "Volume detached for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065036 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6f236ad-2ab6-4e51-b934-402f28844e69-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065047 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a6f236ad-2ab6-4e51-b934-402f28844e69-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065058 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2cjr\" (UniqueName: \"kubernetes.io/projected/53e80c85-256f-4e3a-8338-091b69c8a111-kube-api-access-p2cjr\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065089 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53e80c85-256f-4e3a-8338-091b69c8a111-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065107 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065118 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/53e80c85-256f-4e3a-8338-091b69c8a111-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.065128 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6f236ad-2ab6-4e51-b934-402f28844e69-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.080552 4751 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.083867 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.084388 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.092385 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.166608 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.166662 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.166687 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.166712 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.743534 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"53e80c85-256f-4e3a-8338-091b69c8a111","Type":"ContainerDied","Data":"d0fc51af73d94f86e3cd1b0621a38ca7cd14201bdbba30a0fccb4019efc30e6f"} Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.743563 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.743991 4751 scope.go:117] "RemoveContainer" containerID="52fb0acaeff7876fc2ee5ab2cce699867c40acf2d6cd815e7e538721bdb941cf" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.746998 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a6f236ad-2ab6-4e51-b934-402f28844e69","Type":"ContainerDied","Data":"0404afa0dee3bb2591b16ea7fdc6a0ed77a19e078e63d50f945b13286beb2ed9"} Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.747113 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.781120 4751 scope.go:117] "RemoveContainer" containerID="2470407eb06da53e051c9bcfd402a9b5b782f16d58c74eaa361abad6c79fcccd" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.787132 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.803900 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.817709 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.821694 4751 scope.go:117] "RemoveContainer" containerID="c1281edefeae3927db375b0c14eca77f9671b71769c05b60b41b1179bc1039fe" Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.828241 4751 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:06 crc kubenswrapper[4751]: I0131 15:01:06.843595 4751 scope.go:117] "RemoveContainer" containerID="3d4c9658ef799ffc5bf29e9925845adf68e2321560adab813cd297c7b6dfe0e2" Jan 31 15:01:07 crc kubenswrapper[4751]: I0131 15:01:07.212419 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-b8nfw"] Jan 31 15:01:07 crc kubenswrapper[4751]: I0131 15:01:07.218944 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-b8nfw"] Jan 31 15:01:07 crc kubenswrapper[4751]: I0131 15:01:07.226083 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-bf79-account-create-update-whmk8"] Jan 31 15:01:07 crc kubenswrapper[4751]: I0131 15:01:07.232879 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancebf79-account-delete-vglq2"] Jan 31 15:01:07 crc kubenswrapper[4751]: I0131 15:01:07.238650 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancebf79-account-delete-vglq2"] Jan 31 15:01:07 crc kubenswrapper[4751]: I0131 15:01:07.244943 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-bf79-account-create-update-whmk8"] Jan 31 15:01:08 crc kubenswrapper[4751]: I0131 15:01:08.413642 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa" path="/var/lib/kubelet/pods/4c9ad1c0-9bb7-4d3e-8e68-8310292d89fa/volumes" Jan 31 15:01:08 crc kubenswrapper[4751]: I0131 15:01:08.414748 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" path="/var/lib/kubelet/pods/53e80c85-256f-4e3a-8338-091b69c8a111/volumes" Jan 31 15:01:08 crc kubenswrapper[4751]: I0131 15:01:08.415562 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de76201-fcd1-48a2-8bba-dcdf63bbdf20" 
path="/var/lib/kubelet/pods/6de76201-fcd1-48a2-8bba-dcdf63bbdf20/volumes" Jan 31 15:01:08 crc kubenswrapper[4751]: I0131 15:01:08.416844 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892f8632-f7d8-46b0-a39a-4a84f5e3a2aa" path="/var/lib/kubelet/pods/892f8632-f7d8-46b0-a39a-4a84f5e3a2aa/volumes" Jan 31 15:01:08 crc kubenswrapper[4751]: I0131 15:01:08.417723 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" path="/var/lib/kubelet/pods/a6f236ad-2ab6-4e51-b934-402f28844e69/volumes" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.268787 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9"] Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 15:01:09.269483 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-httpd" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269500 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-httpd" Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 15:01:09.269515 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de76201-fcd1-48a2-8bba-dcdf63bbdf20" containerName="mariadb-account-delete" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269523 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de76201-fcd1-48a2-8bba-dcdf63bbdf20" containerName="mariadb-account-delete" Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 15:01:09.269546 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-log" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269552 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-log" Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 
15:01:09.269561 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-log" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269567 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-log" Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 15:01:09.269582 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" containerName="keystone-cron" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269588 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" containerName="keystone-cron" Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 15:01:09.269599 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-httpd" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269605 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-httpd" Jan 31 15:01:09 crc kubenswrapper[4751]: E0131 15:01:09.269614 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d360673b-7556-44b9-b7bd-4805810da349" containerName="openstackclient" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269633 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d360673b-7556-44b9-b7bd-4805810da349" containerName="openstackclient" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269807 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" containerName="keystone-cron" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269828 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-httpd" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269838 4751 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d360673b-7556-44b9-b7bd-4805810da349" containerName="openstackclient" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269847 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-httpd" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269858 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e80c85-256f-4e3a-8338-091b69c8a111" containerName="glance-log" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269869 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f236ad-2ab6-4e51-b934-402f28844e69" containerName="glance-log" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.269879 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de76201-fcd1-48a2-8bba-dcdf63bbdf20" containerName="mariadb-account-delete" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.270517 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.274022 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.283428 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-fm54m"] Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.284381 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.291539 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-fm54m"] Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.297945 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9"] Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.416277 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31decdae-8d23-4756-b743-4cd4f7709654-operator-scripts\") pod \"glance-db-create-fm54m\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.416352 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cb65\" (UniqueName: \"kubernetes.io/projected/b491fa19-1dde-4e28-919f-f120c0c772b7-kube-api-access-7cb65\") pod \"glance-bd3b-account-create-update-nbdd9\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.416406 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b491fa19-1dde-4e28-919f-f120c0c772b7-operator-scripts\") pod \"glance-bd3b-account-create-update-nbdd9\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.416462 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjwvz\" (UniqueName: 
\"kubernetes.io/projected/31decdae-8d23-4756-b743-4cd4f7709654-kube-api-access-wjwvz\") pod \"glance-db-create-fm54m\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.518351 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31decdae-8d23-4756-b743-4cd4f7709654-operator-scripts\") pod \"glance-db-create-fm54m\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.518446 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cb65\" (UniqueName: \"kubernetes.io/projected/b491fa19-1dde-4e28-919f-f120c0c772b7-kube-api-access-7cb65\") pod \"glance-bd3b-account-create-update-nbdd9\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.518521 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b491fa19-1dde-4e28-919f-f120c0c772b7-operator-scripts\") pod \"glance-bd3b-account-create-update-nbdd9\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.518598 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjwvz\" (UniqueName: \"kubernetes.io/projected/31decdae-8d23-4756-b743-4cd4f7709654-kube-api-access-wjwvz\") pod \"glance-db-create-fm54m\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.519063 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31decdae-8d23-4756-b743-4cd4f7709654-operator-scripts\") pod \"glance-db-create-fm54m\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.519719 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b491fa19-1dde-4e28-919f-f120c0c772b7-operator-scripts\") pod \"glance-bd3b-account-create-update-nbdd9\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.537639 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cb65\" (UniqueName: \"kubernetes.io/projected/b491fa19-1dde-4e28-919f-f120c0c772b7-kube-api-access-7cb65\") pod \"glance-bd3b-account-create-update-nbdd9\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.538788 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjwvz\" (UniqueName: \"kubernetes.io/projected/31decdae-8d23-4756-b743-4cd4f7709654-kube-api-access-wjwvz\") pod \"glance-db-create-fm54m\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.586665 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.606726 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.830802 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9"] Jan 31 15:01:09 crc kubenswrapper[4751]: I0131 15:01:09.887637 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-fm54m"] Jan 31 15:01:09 crc kubenswrapper[4751]: W0131 15:01:09.896496 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31decdae_8d23_4756_b743_4cd4f7709654.slice/crio-7aa6089965c3133848800a7608f1a364a30fa95ef92c6adc517359800a60e4e1 WatchSource:0}: Error finding container 7aa6089965c3133848800a7608f1a364a30fa95ef92c6adc517359800a60e4e1: Status 404 returned error can't find the container with id 7aa6089965c3133848800a7608f1a364a30fa95ef92c6adc517359800a60e4e1 Jan 31 15:01:10 crc kubenswrapper[4751]: I0131 15:01:10.803087 4751 generic.go:334] "Generic (PLEG): container finished" podID="31decdae-8d23-4756-b743-4cd4f7709654" containerID="7e789eeabd8afc4f9d1d5096f902a1d03746cbe8acdf7df1c1fc6d2741b5975c" exitCode=0 Jan 31 15:01:10 crc kubenswrapper[4751]: I0131 15:01:10.803133 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-fm54m" event={"ID":"31decdae-8d23-4756-b743-4cd4f7709654","Type":"ContainerDied","Data":"7e789eeabd8afc4f9d1d5096f902a1d03746cbe8acdf7df1c1fc6d2741b5975c"} Jan 31 15:01:10 crc kubenswrapper[4751]: I0131 15:01:10.803403 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-fm54m" event={"ID":"31decdae-8d23-4756-b743-4cd4f7709654","Type":"ContainerStarted","Data":"7aa6089965c3133848800a7608f1a364a30fa95ef92c6adc517359800a60e4e1"} Jan 31 15:01:10 crc kubenswrapper[4751]: I0131 15:01:10.804650 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="b491fa19-1dde-4e28-919f-f120c0c772b7" containerID="1b08739497c3b40bf4675eac8a3f77cfbe93709c363b0f7d316a1a53ab0f3eab" exitCode=0 Jan 31 15:01:10 crc kubenswrapper[4751]: I0131 15:01:10.804694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" event={"ID":"b491fa19-1dde-4e28-919f-f120c0c772b7","Type":"ContainerDied","Data":"1b08739497c3b40bf4675eac8a3f77cfbe93709c363b0f7d316a1a53ab0f3eab"} Jan 31 15:01:10 crc kubenswrapper[4751]: I0131 15:01:10.804710 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" event={"ID":"b491fa19-1dde-4e28-919f-f120c0c772b7","Type":"ContainerStarted","Data":"05b4f104de7d71eab37b450445806c311fba7d5643451d20c4dfecb872c69cf1"} Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.116385 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.121280 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.257251 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b491fa19-1dde-4e28-919f-f120c0c772b7-operator-scripts\") pod \"b491fa19-1dde-4e28-919f-f120c0c772b7\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.257324 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cb65\" (UniqueName: \"kubernetes.io/projected/b491fa19-1dde-4e28-919f-f120c0c772b7-kube-api-access-7cb65\") pod \"b491fa19-1dde-4e28-919f-f120c0c772b7\" (UID: \"b491fa19-1dde-4e28-919f-f120c0c772b7\") " Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.257348 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31decdae-8d23-4756-b743-4cd4f7709654-operator-scripts\") pod \"31decdae-8d23-4756-b743-4cd4f7709654\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.257414 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjwvz\" (UniqueName: \"kubernetes.io/projected/31decdae-8d23-4756-b743-4cd4f7709654-kube-api-access-wjwvz\") pod \"31decdae-8d23-4756-b743-4cd4f7709654\" (UID: \"31decdae-8d23-4756-b743-4cd4f7709654\") " Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.258018 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31decdae-8d23-4756-b743-4cd4f7709654-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31decdae-8d23-4756-b743-4cd4f7709654" (UID: "31decdae-8d23-4756-b743-4cd4f7709654"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.258372 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b491fa19-1dde-4e28-919f-f120c0c772b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b491fa19-1dde-4e28-919f-f120c0c772b7" (UID: "b491fa19-1dde-4e28-919f-f120c0c772b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.258441 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b491fa19-1dde-4e28-919f-f120c0c772b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.258457 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31decdae-8d23-4756-b743-4cd4f7709654-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.262743 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31decdae-8d23-4756-b743-4cd4f7709654-kube-api-access-wjwvz" (OuterVolumeSpecName: "kube-api-access-wjwvz") pod "31decdae-8d23-4756-b743-4cd4f7709654" (UID: "31decdae-8d23-4756-b743-4cd4f7709654"). InnerVolumeSpecName "kube-api-access-wjwvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.263588 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b491fa19-1dde-4e28-919f-f120c0c772b7-kube-api-access-7cb65" (OuterVolumeSpecName: "kube-api-access-7cb65") pod "b491fa19-1dde-4e28-919f-f120c0c772b7" (UID: "b491fa19-1dde-4e28-919f-f120c0c772b7"). InnerVolumeSpecName "kube-api-access-7cb65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.360208 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cb65\" (UniqueName: \"kubernetes.io/projected/b491fa19-1dde-4e28-919f-f120c0c772b7-kube-api-access-7cb65\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.360270 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjwvz\" (UniqueName: \"kubernetes.io/projected/31decdae-8d23-4756-b743-4cd4f7709654-kube-api-access-wjwvz\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.821176 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" event={"ID":"b491fa19-1dde-4e28-919f-f120c0c772b7","Type":"ContainerDied","Data":"05b4f104de7d71eab37b450445806c311fba7d5643451d20c4dfecb872c69cf1"} Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.821244 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b4f104de7d71eab37b450445806c311fba7d5643451d20c4dfecb872c69cf1" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.821330 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.823827 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-fm54m" event={"ID":"31decdae-8d23-4756-b743-4cd4f7709654","Type":"ContainerDied","Data":"7aa6089965c3133848800a7608f1a364a30fa95ef92c6adc517359800a60e4e1"} Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.823883 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa6089965c3133848800a7608f1a364a30fa95ef92c6adc517359800a60e4e1" Jan 31 15:01:12 crc kubenswrapper[4751]: I0131 15:01:12.823937 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-fm54m" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.481553 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-qslfl"] Jan 31 15:01:14 crc kubenswrapper[4751]: E0131 15:01:14.482037 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b491fa19-1dde-4e28-919f-f120c0c772b7" containerName="mariadb-account-create-update" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.482050 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b491fa19-1dde-4e28-919f-f120c0c772b7" containerName="mariadb-account-create-update" Jan 31 15:01:14 crc kubenswrapper[4751]: E0131 15:01:14.482089 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31decdae-8d23-4756-b743-4cd4f7709654" containerName="mariadb-database-create" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.482095 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="31decdae-8d23-4756-b743-4cd4f7709654" containerName="mariadb-database-create" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.482235 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b491fa19-1dde-4e28-919f-f120c0c772b7" 
containerName="mariadb-account-create-update" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.482247 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="31decdae-8d23-4756-b743-4cd4f7709654" containerName="mariadb-database-create" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.482663 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.485311 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-2pnvw" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.485713 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.486336 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.495195 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qslfl"] Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.598405 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wvr\" (UniqueName: \"kubernetes.io/projected/ec8366b9-bf19-46a4-9033-a05dabe579a4-kube-api-access-n7wvr\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.598470 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-config-data\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 
15:01:14.598497 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-combined-ca-bundle\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.598576 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-db-sync-config-data\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.700606 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wvr\" (UniqueName: \"kubernetes.io/projected/ec8366b9-bf19-46a4-9033-a05dabe579a4-kube-api-access-n7wvr\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.701016 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-config-data\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.701310 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-combined-ca-bundle\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.701618 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-db-sync-config-data\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.708602 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-config-data\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.709029 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-combined-ca-bundle\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.716785 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-db-sync-config-data\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.732861 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7wvr\" (UniqueName: \"kubernetes.io/projected/ec8366b9-bf19-46a4-9033-a05dabe579a4-kube-api-access-n7wvr\") pod \"glance-db-sync-qslfl\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:14 crc kubenswrapper[4751]: I0131 15:01:14.812264 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:15 crc kubenswrapper[4751]: I0131 15:01:15.300642 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qslfl"] Jan 31 15:01:15 crc kubenswrapper[4751]: I0131 15:01:15.851641 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qslfl" event={"ID":"ec8366b9-bf19-46a4-9033-a05dabe579a4","Type":"ContainerStarted","Data":"f2d3ac70f8ddad94f9d969f2045d3e2ecc9acc9f7ef1ceb69fe7a6e69910af4e"} Jan 31 15:01:15 crc kubenswrapper[4751]: I0131 15:01:15.852009 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qslfl" event={"ID":"ec8366b9-bf19-46a4-9033-a05dabe579a4","Type":"ContainerStarted","Data":"28ee5450b21d710d3178a37262a40b9ef50fcb2817d95afa9be7d36b03349a2b"} Jan 31 15:01:15 crc kubenswrapper[4751]: I0131 15:01:15.870395 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-qslfl" podStartSLOduration=1.870372776 podStartE2EDuration="1.870372776s" podCreationTimestamp="2026-01-31 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:15.865492177 +0000 UTC m=+1180.240205062" watchObservedRunningTime="2026-01-31 15:01:15.870372776 +0000 UTC m=+1180.245085681" Jan 31 15:01:18 crc kubenswrapper[4751]: I0131 15:01:18.881605 4751 generic.go:334] "Generic (PLEG): container finished" podID="ec8366b9-bf19-46a4-9033-a05dabe579a4" containerID="f2d3ac70f8ddad94f9d969f2045d3e2ecc9acc9f7ef1ceb69fe7a6e69910af4e" exitCode=0 Jan 31 15:01:18 crc kubenswrapper[4751]: I0131 15:01:18.881741 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qslfl" 
event={"ID":"ec8366b9-bf19-46a4-9033-a05dabe579a4","Type":"ContainerDied","Data":"f2d3ac70f8ddad94f9d969f2045d3e2ecc9acc9f7ef1ceb69fe7a6e69910af4e"} Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.211574 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.289857 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-combined-ca-bundle\") pod \"ec8366b9-bf19-46a4-9033-a05dabe579a4\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.290469 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-db-sync-config-data\") pod \"ec8366b9-bf19-46a4-9033-a05dabe579a4\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.290498 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-config-data\") pod \"ec8366b9-bf19-46a4-9033-a05dabe579a4\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.290535 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7wvr\" (UniqueName: \"kubernetes.io/projected/ec8366b9-bf19-46a4-9033-a05dabe579a4-kube-api-access-n7wvr\") pod \"ec8366b9-bf19-46a4-9033-a05dabe579a4\" (UID: \"ec8366b9-bf19-46a4-9033-a05dabe579a4\") " Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.306483 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-db-sync-config-data" 
(OuterVolumeSpecName: "db-sync-config-data") pod "ec8366b9-bf19-46a4-9033-a05dabe579a4" (UID: "ec8366b9-bf19-46a4-9033-a05dabe579a4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.318612 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8366b9-bf19-46a4-9033-a05dabe579a4-kube-api-access-n7wvr" (OuterVolumeSpecName: "kube-api-access-n7wvr") pod "ec8366b9-bf19-46a4-9033-a05dabe579a4" (UID: "ec8366b9-bf19-46a4-9033-a05dabe579a4"). InnerVolumeSpecName "kube-api-access-n7wvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.336282 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec8366b9-bf19-46a4-9033-a05dabe579a4" (UID: "ec8366b9-bf19-46a4-9033-a05dabe579a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.367243 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-config-data" (OuterVolumeSpecName: "config-data") pod "ec8366b9-bf19-46a4-9033-a05dabe579a4" (UID: "ec8366b9-bf19-46a4-9033-a05dabe579a4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.392580 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.392623 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.392636 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7wvr\" (UniqueName: \"kubernetes.io/projected/ec8366b9-bf19-46a4-9033-a05dabe579a4-kube-api-access-n7wvr\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.392651 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8366b9-bf19-46a4-9033-a05dabe579a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.903592 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-qslfl" event={"ID":"ec8366b9-bf19-46a4-9033-a05dabe579a4","Type":"ContainerDied","Data":"28ee5450b21d710d3178a37262a40b9ef50fcb2817d95afa9be7d36b03349a2b"} Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.903629 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ee5450b21d710d3178a37262a40b9ef50fcb2817d95afa9be7d36b03349a2b" Jan 31 15:01:20 crc kubenswrapper[4751]: I0131 15:01:20.903651 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-qslfl" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.181385 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:22 crc kubenswrapper[4751]: E0131 15:01:22.182637 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8366b9-bf19-46a4-9033-a05dabe579a4" containerName="glance-db-sync" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.182767 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8366b9-bf19-46a4-9033-a05dabe579a4" containerName="glance-db-sync" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.183023 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8366b9-bf19-46a4-9033-a05dabe579a4" containerName="glance-db-sync" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.183993 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.187006 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.187370 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.187733 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.188089 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.188253 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-2pnvw" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.188417 4751 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.205317 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.318945 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.319309 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfmdr\" (UniqueName: \"kubernetes.io/projected/255bf0e7-10e4-4d84-8607-14c83ac28044-kube-api-access-bfmdr\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.319447 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-logs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.319604 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-scripts\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.319729 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.319840 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-httpd-run\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.319946 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.320089 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.320235 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-config-data\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421638 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421702 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfmdr\" (UniqueName: \"kubernetes.io/projected/255bf0e7-10e4-4d84-8607-14c83ac28044-kube-api-access-bfmdr\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421738 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-logs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421770 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-scripts\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421792 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421811 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-httpd-run\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421837 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.421883 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-config-data\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.422490 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-logs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.422632 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-httpd-run\") pod \"glance-default-single-0\" 
(UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.422862 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.426305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.426802 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.426846 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-scripts\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.427563 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " 
pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.428337 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-config-data\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.440895 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfmdr\" (UniqueName: \"kubernetes.io/projected/255bf0e7-10e4-4d84-8607-14c83ac28044-kube-api-access-bfmdr\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.448670 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-0\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:22 crc kubenswrapper[4751]: I0131 15:01:22.499291 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:23 crc kubenswrapper[4751]: I0131 15:01:23.307314 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:23 crc kubenswrapper[4751]: I0131 15:01:23.929736 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"255bf0e7-10e4-4d84-8607-14c83ac28044","Type":"ContainerStarted","Data":"ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c"} Jan 31 15:01:23 crc kubenswrapper[4751]: I0131 15:01:23.929988 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"255bf0e7-10e4-4d84-8607-14c83ac28044","Type":"ContainerStarted","Data":"f8952d07c6de0f133fd84db9683a5d98d942ed60617f64795edab9f81a8ffdfb"} Jan 31 15:01:24 crc kubenswrapper[4751]: I0131 15:01:24.940332 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"255bf0e7-10e4-4d84-8607-14c83ac28044","Type":"ContainerStarted","Data":"a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c"} Jan 31 15:01:24 crc kubenswrapper[4751]: I0131 15:01:24.967476 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.967461269 podStartE2EDuration="2.967461269s" podCreationTimestamp="2026-01-31 15:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:24.964709486 +0000 UTC m=+1189.339422431" watchObservedRunningTime="2026-01-31 15:01:24.967461269 +0000 UTC m=+1189.342174154" Jan 31 15:01:32 crc kubenswrapper[4751]: I0131 15:01:32.500417 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:32 crc 
kubenswrapper[4751]: I0131 15:01:32.500873 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:32 crc kubenswrapper[4751]: I0131 15:01:32.539342 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:32 crc kubenswrapper[4751]: I0131 15:01:32.548896 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:33 crc kubenswrapper[4751]: I0131 15:01:33.006120 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:33 crc kubenswrapper[4751]: I0131 15:01:33.006153 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:34 crc kubenswrapper[4751]: I0131 15:01:34.957111 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:34 crc kubenswrapper[4751]: I0131 15:01:34.962250 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.467832 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qslfl"] Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.475783 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-qslfl"] Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.510651 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancebd3b-account-delete-wcqq6"] Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.511611 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.523726 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancebd3b-account-delete-wcqq6"] Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.554167 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtdl\" (UniqueName: \"kubernetes.io/projected/3b7181ac-f336-4658-bffc-63553f8972d9-kube-api-access-xdtdl\") pod \"glancebd3b-account-delete-wcqq6\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.554285 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b7181ac-f336-4658-bffc-63553f8972d9-operator-scripts\") pod \"glancebd3b-account-delete-wcqq6\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.569024 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.655748 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdtdl\" (UniqueName: \"kubernetes.io/projected/3b7181ac-f336-4658-bffc-63553f8972d9-kube-api-access-xdtdl\") pod \"glancebd3b-account-delete-wcqq6\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.655821 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b7181ac-f336-4658-bffc-63553f8972d9-operator-scripts\") pod 
\"glancebd3b-account-delete-wcqq6\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.656684 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b7181ac-f336-4658-bffc-63553f8972d9-operator-scripts\") pod \"glancebd3b-account-delete-wcqq6\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.673201 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdtdl\" (UniqueName: \"kubernetes.io/projected/3b7181ac-f336-4658-bffc-63553f8972d9-kube-api-access-xdtdl\") pod \"glancebd3b-account-delete-wcqq6\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:36 crc kubenswrapper[4751]: I0131 15:01:36.836853 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:37 crc kubenswrapper[4751]: I0131 15:01:37.051912 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-log" containerID="cri-o://ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c" gracePeriod=30 Jan 31 15:01:37 crc kubenswrapper[4751]: I0131 15:01:37.052007 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-httpd" containerID="cri-o://a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c" gracePeriod=30 Jan 31 15:01:37 crc kubenswrapper[4751]: I0131 15:01:37.057120 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.110:9292/healthcheck\": EOF" Jan 31 15:01:37 crc kubenswrapper[4751]: I0131 15:01:37.058810 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.110:9292/healthcheck\": EOF" Jan 31 15:01:37 crc kubenswrapper[4751]: I0131 15:01:37.257712 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancebd3b-account-delete-wcqq6"] Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.062824 4751 generic.go:334] "Generic (PLEG): container finished" podID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerID="ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c" exitCode=143 Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.062920 4751 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"255bf0e7-10e4-4d84-8607-14c83ac28044","Type":"ContainerDied","Data":"ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c"} Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.065256 4751 generic.go:334] "Generic (PLEG): container finished" podID="3b7181ac-f336-4658-bffc-63553f8972d9" containerID="5da73e1408c3942c575e820ab3bbf5f7e673d6aadac72064d98cb22aab529aa9" exitCode=0 Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.065294 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" event={"ID":"3b7181ac-f336-4658-bffc-63553f8972d9","Type":"ContainerDied","Data":"5da73e1408c3942c575e820ab3bbf5f7e673d6aadac72064d98cb22aab529aa9"} Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.065345 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" event={"ID":"3b7181ac-f336-4658-bffc-63553f8972d9","Type":"ContainerStarted","Data":"45627c6bb5b845b7afa1eefcb62304dc9bb91b2ca087df0c88c11323e14f29b4"} Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.414417 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8366b9-bf19-46a4-9033-a05dabe579a4" path="/var/lib/kubelet/pods/ec8366b9-bf19-46a4-9033-a05dabe579a4/volumes" Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.896761 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:01:38 crc kubenswrapper[4751]: I0131 15:01:38.897086 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.375956 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.493596 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b7181ac-f336-4658-bffc-63553f8972d9-operator-scripts\") pod \"3b7181ac-f336-4658-bffc-63553f8972d9\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.493644 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdtdl\" (UniqueName: \"kubernetes.io/projected/3b7181ac-f336-4658-bffc-63553f8972d9-kube-api-access-xdtdl\") pod \"3b7181ac-f336-4658-bffc-63553f8972d9\" (UID: \"3b7181ac-f336-4658-bffc-63553f8972d9\") " Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.494247 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b7181ac-f336-4658-bffc-63553f8972d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b7181ac-f336-4658-bffc-63553f8972d9" (UID: "3b7181ac-f336-4658-bffc-63553f8972d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.498266 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7181ac-f336-4658-bffc-63553f8972d9-kube-api-access-xdtdl" (OuterVolumeSpecName: "kube-api-access-xdtdl") pod "3b7181ac-f336-4658-bffc-63553f8972d9" (UID: "3b7181ac-f336-4658-bffc-63553f8972d9"). InnerVolumeSpecName "kube-api-access-xdtdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.595062 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b7181ac-f336-4658-bffc-63553f8972d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:39 crc kubenswrapper[4751]: I0131 15:01:39.595113 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdtdl\" (UniqueName: \"kubernetes.io/projected/3b7181ac-f336-4658-bffc-63553f8972d9-kube-api-access-xdtdl\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.083356 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" event={"ID":"3b7181ac-f336-4658-bffc-63553f8972d9","Type":"ContainerDied","Data":"45627c6bb5b845b7afa1eefcb62304dc9bb91b2ca087df0c88c11323e14f29b4"} Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.083410 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45627c6bb5b845b7afa1eefcb62304dc9bb91b2ca087df0c88c11323e14f29b4" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.083545 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancebd3b-account-delete-wcqq6" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.554432 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.607890 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfmdr\" (UniqueName: \"kubernetes.io/projected/255bf0e7-10e4-4d84-8607-14c83ac28044-kube-api-access-bfmdr\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.607942 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-internal-tls-certs\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608024 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-config-data\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608318 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-combined-ca-bundle\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608362 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-public-tls-certs\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608432 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-scripts\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608490 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-httpd-run\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.608544 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-logs\") pod \"255bf0e7-10e4-4d84-8607-14c83ac28044\" (UID: \"255bf0e7-10e4-4d84-8607-14c83ac28044\") " Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.609484 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-logs" (OuterVolumeSpecName: "logs") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.609665 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.614253 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-scripts" (OuterVolumeSpecName: "scripts") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.614252 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255bf0e7-10e4-4d84-8607-14c83ac28044-kube-api-access-bfmdr" (OuterVolumeSpecName: "kube-api-access-bfmdr") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "kube-api-access-bfmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.614904 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.627932 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.643764 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-config-data" (OuterVolumeSpecName: "config-data") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.644304 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.645892 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "255bf0e7-10e4-4d84-8607-14c83ac28044" (UID: "255bf0e7-10e4-4d84-8607-14c83ac28044"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709895 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709923 4751 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709935 4751 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709943 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709967 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" "
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709976 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709984 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/255bf0e7-10e4-4d84-8607-14c83ac28044-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.709998 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfmdr\" (UniqueName: \"kubernetes.io/projected/255bf0e7-10e4-4d84-8607-14c83ac28044-kube-api-access-bfmdr\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.710014 4751 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/255bf0e7-10e4-4d84-8607-14c83ac28044-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.725996 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc"
Jan 31 15:01:40 crc kubenswrapper[4751]: I0131 15:01:40.811793 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.093646 4751 generic.go:334] "Generic (PLEG): container finished" podID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerID="a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c" exitCode=0
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.093695 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"255bf0e7-10e4-4d84-8607-14c83ac28044","Type":"ContainerDied","Data":"a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c"}
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.093713 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.093737 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"255bf0e7-10e4-4d84-8607-14c83ac28044","Type":"ContainerDied","Data":"f8952d07c6de0f133fd84db9683a5d98d942ed60617f64795edab9f81a8ffdfb"}
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.093760 4751 scope.go:117] "RemoveContainer" containerID="a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.119303 4751 scope.go:117] "RemoveContainer" containerID="ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.127597 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.133942 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.149817 4751 scope.go:117] "RemoveContainer" containerID="a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c"
Jan 31 15:01:41 crc kubenswrapper[4751]: E0131 15:01:41.150303 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c\": container with ID starting with a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c not found: ID does not exist" containerID="a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.150336 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c"} err="failed to get container status \"a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c\": rpc error: code = NotFound desc = could not find container \"a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c\": container with ID starting with a08737138511460d108c2e58fd850d74f67395f3f43148d743fdd1308994567c not found: ID does not exist"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.150359 4751 scope.go:117] "RemoveContainer" containerID="ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c"
Jan 31 15:01:41 crc kubenswrapper[4751]: E0131 15:01:41.150843 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c\": container with ID starting with ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c not found: ID does not exist" containerID="ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.150906 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c"} err="failed to get container status \"ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c\": rpc error: code = NotFound desc = could not find container \"ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c\": container with ID starting with ab0fad05949e27dc661aeaec62db4c80b70388679d98c799d711588e4b30d20c not found: ID does not exist"
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.519000 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-fm54m"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.525177 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-fm54m"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.535466 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.540831 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancebd3b-account-delete-wcqq6"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.545835 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancebd3b-account-delete-wcqq6"]
Jan 31 15:01:41 crc kubenswrapper[4751]: I0131 15:01:41.550965 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-bd3b-account-create-update-nbdd9"]
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087004 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-mcgm2"]
Jan 31 15:01:42 crc kubenswrapper[4751]: E0131 15:01:42.087375 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-log"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087391 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-log"
Jan 31 15:01:42 crc kubenswrapper[4751]: E0131 15:01:42.087416 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-httpd"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087424 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-httpd"
Jan 31 15:01:42 crc kubenswrapper[4751]: E0131 15:01:42.087456 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7181ac-f336-4658-bffc-63553f8972d9" containerName="mariadb-account-delete"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087468 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7181ac-f336-4658-bffc-63553f8972d9" containerName="mariadb-account-delete"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087623 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-httpd"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087641 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" containerName="glance-log"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.087660 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7181ac-f336-4658-bffc-63553f8972d9" containerName="mariadb-account-delete"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.088342 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.093957 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-a977-account-create-update-tlstz"]
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.095051 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.099971 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.101716 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-mcgm2"]
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.111564 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a977-account-create-update-tlstz"]
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.132044 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtkk5\" (UniqueName: \"kubernetes.io/projected/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-kube-api-access-xtkk5\") pod \"glance-a977-account-create-update-tlstz\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") " pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.132135 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-operator-scripts\") pod \"glance-a977-account-create-update-tlstz\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") " pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.132186 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mlch\" (UniqueName: \"kubernetes.io/projected/d9e826f0-62a4-4a7c-8945-0c29cd34e667-kube-api-access-4mlch\") pod \"glance-db-create-mcgm2\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") " pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.132211 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e826f0-62a4-4a7c-8945-0c29cd34e667-operator-scripts\") pod \"glance-db-create-mcgm2\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") " pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.233562 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtkk5\" (UniqueName: \"kubernetes.io/projected/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-kube-api-access-xtkk5\") pod \"glance-a977-account-create-update-tlstz\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") " pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.233678 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-operator-scripts\") pod \"glance-a977-account-create-update-tlstz\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") " pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.234520 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-operator-scripts\") pod \"glance-a977-account-create-update-tlstz\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") " pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.234601 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mlch\" (UniqueName: \"kubernetes.io/projected/d9e826f0-62a4-4a7c-8945-0c29cd34e667-kube-api-access-4mlch\") pod \"glance-db-create-mcgm2\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") " pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.234632 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e826f0-62a4-4a7c-8945-0c29cd34e667-operator-scripts\") pod \"glance-db-create-mcgm2\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") " pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.235591 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e826f0-62a4-4a7c-8945-0c29cd34e667-operator-scripts\") pod \"glance-db-create-mcgm2\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") " pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.250454 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mlch\" (UniqueName: \"kubernetes.io/projected/d9e826f0-62a4-4a7c-8945-0c29cd34e667-kube-api-access-4mlch\") pod \"glance-db-create-mcgm2\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") " pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.252014 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtkk5\" (UniqueName: \"kubernetes.io/projected/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-kube-api-access-xtkk5\") pod \"glance-a977-account-create-update-tlstz\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") " pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.408485 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.413519 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.415111 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255bf0e7-10e4-4d84-8607-14c83ac28044" path="/var/lib/kubelet/pods/255bf0e7-10e4-4d84-8607-14c83ac28044/volumes"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.415850 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31decdae-8d23-4756-b743-4cd4f7709654" path="/var/lib/kubelet/pods/31decdae-8d23-4756-b743-4cd4f7709654/volumes"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.416395 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7181ac-f336-4658-bffc-63553f8972d9" path="/var/lib/kubelet/pods/3b7181ac-f336-4658-bffc-63553f8972d9/volumes"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.417350 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b491fa19-1dde-4e28-919f-f120c0c772b7" path="/var/lib/kubelet/pods/b491fa19-1dde-4e28-919f-f120c0c772b7/volumes"
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.818403 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-mcgm2"]
Jan 31 15:01:42 crc kubenswrapper[4751]: I0131 15:01:42.888867 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a977-account-create-update-tlstz"]
Jan 31 15:01:43 crc kubenswrapper[4751]: I0131 15:01:43.120117 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-mcgm2" event={"ID":"d9e826f0-62a4-4a7c-8945-0c29cd34e667","Type":"ContainerStarted","Data":"488f4cd159917294625dbe3f504270e4c6cae704ed670c29ddb28b43bab332ff"}
Jan 31 15:01:43 crc kubenswrapper[4751]: I0131 15:01:43.120469 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-mcgm2" event={"ID":"d9e826f0-62a4-4a7c-8945-0c29cd34e667","Type":"ContainerStarted","Data":"bac6aaf151aa682a72b79c98606148bc73e67cc5fae8b0736586855de1506b67"}
Jan 31 15:01:43 crc kubenswrapper[4751]: I0131 15:01:43.123318 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz" event={"ID":"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4","Type":"ContainerStarted","Data":"9903c977627bd13e9ad2f5f25c1001bf58623795a6fa400f5ca5b3724b524577"}
Jan 31 15:01:43 crc kubenswrapper[4751]: I0131 15:01:43.123362 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz" event={"ID":"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4","Type":"ContainerStarted","Data":"74dd271892eb653086ff0009d74ed2a422288b125c2bf6d7587b2c354b96d3a2"}
Jan 31 15:01:43 crc kubenswrapper[4751]: I0131 15:01:43.139946 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-mcgm2" podStartSLOduration=1.139924001 podStartE2EDuration="1.139924001s" podCreationTimestamp="2026-01-31 15:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:43.132402363 +0000 UTC m=+1207.507115248" watchObservedRunningTime="2026-01-31 15:01:43.139924001 +0000 UTC m=+1207.514636886"
Jan 31 15:01:43 crc kubenswrapper[4751]: I0131 15:01:43.154085 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz" podStartSLOduration=1.1540374230000001 podStartE2EDuration="1.154037423s" podCreationTimestamp="2026-01-31 15:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:43.153348235 +0000 UTC m=+1207.528061130" watchObservedRunningTime="2026-01-31 15:01:43.154037423 +0000 UTC m=+1207.528750308"
Jan 31 15:01:44 crc kubenswrapper[4751]: I0131 15:01:44.130327 4751 generic.go:334] "Generic (PLEG): container finished" podID="d9e826f0-62a4-4a7c-8945-0c29cd34e667" containerID="488f4cd159917294625dbe3f504270e4c6cae704ed670c29ddb28b43bab332ff" exitCode=0
Jan 31 15:01:44 crc kubenswrapper[4751]: I0131 15:01:44.130422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-mcgm2" event={"ID":"d9e826f0-62a4-4a7c-8945-0c29cd34e667","Type":"ContainerDied","Data":"488f4cd159917294625dbe3f504270e4c6cae704ed670c29ddb28b43bab332ff"}
Jan 31 15:01:44 crc kubenswrapper[4751]: I0131 15:01:44.132431 4751 generic.go:334] "Generic (PLEG): container finished" podID="e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" containerID="9903c977627bd13e9ad2f5f25c1001bf58623795a6fa400f5ca5b3724b524577" exitCode=0
Jan 31 15:01:44 crc kubenswrapper[4751]: I0131 15:01:44.132469 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz" event={"ID":"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4","Type":"ContainerDied","Data":"9903c977627bd13e9ad2f5f25c1001bf58623795a6fa400f5ca5b3724b524577"}
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.536179 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.544918 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.590627 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e826f0-62a4-4a7c-8945-0c29cd34e667-operator-scripts\") pod \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") "
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.590660 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-operator-scripts\") pod \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") "
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.590688 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtkk5\" (UniqueName: \"kubernetes.io/projected/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-kube-api-access-xtkk5\") pod \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\" (UID: \"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4\") "
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.590718 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mlch\" (UniqueName: \"kubernetes.io/projected/d9e826f0-62a4-4a7c-8945-0c29cd34e667-kube-api-access-4mlch\") pod \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\" (UID: \"d9e826f0-62a4-4a7c-8945-0c29cd34e667\") "
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.591349 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" (UID: "e9730563-64d8-44a2-9d93-7fe5fcd4c8d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.591490 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e826f0-62a4-4a7c-8945-0c29cd34e667-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9e826f0-62a4-4a7c-8945-0c29cd34e667" (UID: "d9e826f0-62a4-4a7c-8945-0c29cd34e667"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.595742 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-kube-api-access-xtkk5" (OuterVolumeSpecName: "kube-api-access-xtkk5") pod "e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" (UID: "e9730563-64d8-44a2-9d93-7fe5fcd4c8d4"). InnerVolumeSpecName "kube-api-access-xtkk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.595821 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e826f0-62a4-4a7c-8945-0c29cd34e667-kube-api-access-4mlch" (OuterVolumeSpecName: "kube-api-access-4mlch") pod "d9e826f0-62a4-4a7c-8945-0c29cd34e667" (UID: "d9e826f0-62a4-4a7c-8945-0c29cd34e667"). InnerVolumeSpecName "kube-api-access-4mlch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.691949 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9e826f0-62a4-4a7c-8945-0c29cd34e667-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.691975 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.691984 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtkk5\" (UniqueName: \"kubernetes.io/projected/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4-kube-api-access-xtkk5\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:45 crc kubenswrapper[4751]: I0131 15:01:45.691995 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mlch\" (UniqueName: \"kubernetes.io/projected/d9e826f0-62a4-4a7c-8945-0c29cd34e667-kube-api-access-4mlch\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:46 crc kubenswrapper[4751]: I0131 15:01:46.168716 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz" event={"ID":"e9730563-64d8-44a2-9d93-7fe5fcd4c8d4","Type":"ContainerDied","Data":"74dd271892eb653086ff0009d74ed2a422288b125c2bf6d7587b2c354b96d3a2"}
Jan 31 15:01:46 crc kubenswrapper[4751]: I0131 15:01:46.168775 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74dd271892eb653086ff0009d74ed2a422288b125c2bf6d7587b2c354b96d3a2"
Jan 31 15:01:46 crc kubenswrapper[4751]: I0131 15:01:46.168888 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a977-account-create-update-tlstz"
Jan 31 15:01:46 crc kubenswrapper[4751]: I0131 15:01:46.171557 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-mcgm2" event={"ID":"d9e826f0-62a4-4a7c-8945-0c29cd34e667","Type":"ContainerDied","Data":"bac6aaf151aa682a72b79c98606148bc73e67cc5fae8b0736586855de1506b67"}
Jan 31 15:01:46 crc kubenswrapper[4751]: I0131 15:01:46.171679 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bac6aaf151aa682a72b79c98606148bc73e67cc5fae8b0736586855de1506b67"
Jan 31 15:01:46 crc kubenswrapper[4751]: I0131 15:01:46.171789 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-mcgm2"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.229756 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-mxvm7"]
Jan 31 15:01:47 crc kubenswrapper[4751]: E0131 15:01:47.230371 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" containerName="mariadb-account-create-update"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.230389 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" containerName="mariadb-account-create-update"
Jan 31 15:01:47 crc kubenswrapper[4751]: E0131 15:01:47.230413 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e826f0-62a4-4a7c-8945-0c29cd34e667" containerName="mariadb-database-create"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.230420 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e826f0-62a4-4a7c-8945-0c29cd34e667" containerName="mariadb-database-create"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.230551 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" containerName="mariadb-account-create-update"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.230571 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e826f0-62a4-4a7c-8945-0c29cd34e667" containerName="mariadb-database-create"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.231022 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.234730 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.239003 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-6mvx9"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.245707 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mxvm7"]
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.314832 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-config-data\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.314906 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-db-sync-config-data\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.315019 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczzd\" (UniqueName: \"kubernetes.io/projected/dbf741e4-9445-4080-84f2-601e270f7aa0-kube-api-access-hczzd\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.415855 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-db-sync-config-data\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.415914 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczzd\" (UniqueName: \"kubernetes.io/projected/dbf741e4-9445-4080-84f2-601e270f7aa0-kube-api-access-hczzd\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.415981 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-config-data\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.420162 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-config-data\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.420615 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-db-sync-config-data\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.432459 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczzd\" (UniqueName: \"kubernetes.io/projected/dbf741e4-9445-4080-84f2-601e270f7aa0-kube-api-access-hczzd\") pod \"glance-db-sync-mxvm7\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") " pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.547568 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:47 crc kubenswrapper[4751]: I0131 15:01:47.963498 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mxvm7"]
Jan 31 15:01:48 crc kubenswrapper[4751]: I0131 15:01:48.191830 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mxvm7" event={"ID":"dbf741e4-9445-4080-84f2-601e270f7aa0","Type":"ContainerStarted","Data":"85b52c03617c4a88d648ff21fe628b61e341e359552690aead8461966d078b23"}
Jan 31 15:01:49 crc kubenswrapper[4751]: I0131 15:01:49.200422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mxvm7" event={"ID":"dbf741e4-9445-4080-84f2-601e270f7aa0","Type":"ContainerStarted","Data":"f75375e8e6ad82f0f02e30825660a61882c0595e19792c2979a8125e9bf94686"}
Jan 31 15:01:49 crc kubenswrapper[4751]: I0131 15:01:49.217541 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-mxvm7" podStartSLOduration=2.217520098 podStartE2EDuration="2.217520098s" podCreationTimestamp="2026-01-31 15:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:49.215997558 +0000 UTC m=+1213.590710443" watchObservedRunningTime="2026-01-31 15:01:49.217520098 +0000 UTC m=+1213.592232983"
Jan 31 15:01:52 crc kubenswrapper[4751]: I0131 15:01:52.228792 4751 generic.go:334] "Generic (PLEG): container finished" podID="dbf741e4-9445-4080-84f2-601e270f7aa0" containerID="f75375e8e6ad82f0f02e30825660a61882c0595e19792c2979a8125e9bf94686" exitCode=0
Jan 31 15:01:52 crc kubenswrapper[4751]: I0131 15:01:52.228863 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mxvm7" event={"ID":"dbf741e4-9445-4080-84f2-601e270f7aa0","Type":"ContainerDied","Data":"f75375e8e6ad82f0f02e30825660a61882c0595e19792c2979a8125e9bf94686"}
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.551470 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mxvm7"
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.630930 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hczzd\" (UniqueName: \"kubernetes.io/projected/dbf741e4-9445-4080-84f2-601e270f7aa0-kube-api-access-hczzd\") pod \"dbf741e4-9445-4080-84f2-601e270f7aa0\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") "
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.630988 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-db-sync-config-data\") pod \"dbf741e4-9445-4080-84f2-601e270f7aa0\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") "
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.631017 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-config-data\") pod \"dbf741e4-9445-4080-84f2-601e270f7aa0\" (UID: \"dbf741e4-9445-4080-84f2-601e270f7aa0\") "
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.636014 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dbf741e4-9445-4080-84f2-601e270f7aa0" (UID: "dbf741e4-9445-4080-84f2-601e270f7aa0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.636461 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf741e4-9445-4080-84f2-601e270f7aa0-kube-api-access-hczzd" (OuterVolumeSpecName: "kube-api-access-hczzd") pod "dbf741e4-9445-4080-84f2-601e270f7aa0" (UID: "dbf741e4-9445-4080-84f2-601e270f7aa0"). InnerVolumeSpecName "kube-api-access-hczzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.664428 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-config-data" (OuterVolumeSpecName: "config-data") pod "dbf741e4-9445-4080-84f2-601e270f7aa0" (UID: "dbf741e4-9445-4080-84f2-601e270f7aa0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.732147 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hczzd\" (UniqueName: \"kubernetes.io/projected/dbf741e4-9445-4080-84f2-601e270f7aa0-kube-api-access-hczzd\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.732193 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:53 crc kubenswrapper[4751]: I0131 15:01:53.732207 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf741e4-9445-4080-84f2-601e270f7aa0-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:01:54 crc kubenswrapper[4751]: I0131 15:01:54.264437 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mxvm7" event={"ID":"dbf741e4-9445-4080-84f2-601e270f7aa0","Type":"ContainerDied","Data":"85b52c03617c4a88d648ff21fe628b61e341e359552690aead8461966d078b23"}
Jan 31 15:01:54 crc kubenswrapper[4751]: I0131 15:01:54.264484 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85b52c03617c4a88d648ff21fe628b61e341e359552690aead8461966d078b23"
Jan 31 15:01:54 crc kubenswrapper[4751]: I0131 15:01:54.264504 4751 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mxvm7" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.462664 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:01:55 crc kubenswrapper[4751]: E0131 15:01:55.463143 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf741e4-9445-4080-84f2-601e270f7aa0" containerName="glance-db-sync" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.463155 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf741e4-9445-4080-84f2-601e270f7aa0" containerName="glance-db-sync" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.463295 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf741e4-9445-4080-84f2-601e270f7aa0" containerName="glance-db-sync" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.464144 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.467671 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.468321 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.472015 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-6mvx9" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.485017 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.571293 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") 
pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.571584 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.571696 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.571792 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.571898 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w868q\" (UniqueName: \"kubernetes.io/projected/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-kube-api-access-w868q\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572020 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572143 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572239 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572351 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-logs\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572436 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-run\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572552 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572650 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572734 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-sys\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.572838 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-dev\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.621787 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.623095 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.626547 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.647399 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678738 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678784 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678806 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678831 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 
crc kubenswrapper[4751]: I0131 15:01:55.678853 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w868q\" (UniqueName: \"kubernetes.io/projected/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-kube-api-access-w868q\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678881 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678903 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678924 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678941 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-logs\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 
15:01:55.678958 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-run\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678972 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.678980 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679039 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-sys\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679062 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679120 4751 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-dev\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679228 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679302 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-sys\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679281 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-dev\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679456 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679494 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679594 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679641 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679647 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-run\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.679649 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.680081 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-logs\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.684488 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.687089 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.702355 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w868q\" (UniqueName: \"kubernetes.io/projected/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-kube-api-access-w868q\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.708360 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.718912 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod 
\"glance-default-external-api-0\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780270 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-run\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780625 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nck\" (UniqueName: \"kubernetes.io/projected/221322d6-160f-48ee-bed1-a02ac6cbfb09-kube-api-access-54nck\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780434 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780659 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-logs\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780854 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-sys\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780898 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.780959 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781299 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781423 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-dev\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781486 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781569 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781703 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781777 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-lib-modules\") pod 
\"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781848 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.781918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883136 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nck\" (UniqueName: \"kubernetes.io/projected/221322d6-160f-48ee-bed1-a02ac6cbfb09-kube-api-access-54nck\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883196 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-logs\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883238 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-sys\") pod 
\"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883260 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883296 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883339 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883362 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883396 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-dev\") pod \"glance-default-internal-api-0\" (UID: 
\"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883424 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883485 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-dev\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883509 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883495 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883568 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-sys\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883653 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883663 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883689 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883713 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883736 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883740 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883776 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-run\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883864 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-run\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883911 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.883959 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 
15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.884089 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-logs\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.884123 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.888809 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.890021 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.902603 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.902840 4751 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.916104 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nck\" (UniqueName: \"kubernetes.io/projected/221322d6-160f-48ee-bed1-a02ac6cbfb09-kube-api-access-54nck\") pod \"glance-default-internal-api-0\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:55 crc kubenswrapper[4751]: I0131 15:01:55.935422 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:56 crc kubenswrapper[4751]: I0131 15:01:56.196352 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:01:56 crc kubenswrapper[4751]: I0131 15:01:56.280621 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerStarted","Data":"632343238b8b6273cfe0d462a1823f7261ef1f48b55a453cfe7a4028e8a3bc11"} Jan 31 15:01:56 crc kubenswrapper[4751]: I0131 15:01:56.340177 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:01:56 crc kubenswrapper[4751]: I0131 15:01:56.733806 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:01:57 crc kubenswrapper[4751]: I0131 15:01:57.291201 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerStarted","Data":"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12"} Jan 31 15:01:57 crc kubenswrapper[4751]: I0131 15:01:57.291859 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerStarted","Data":"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8"} Jan 31 15:01:57 crc kubenswrapper[4751]: I0131 15:01:57.292889 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerStarted","Data":"2d17a7d49f7479975731597d7e17ac81d17ead2b622a0ef7093e781d499f7009"} Jan 31 15:01:57 crc kubenswrapper[4751]: I0131 15:01:57.292937 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerStarted","Data":"2145d899923840c33715fa17628307294c5047421a38139dabc06cf3d05cb997"} Jan 31 15:01:57 crc kubenswrapper[4751]: I0131 15:01:57.292951 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerStarted","Data":"31e67ae7532286bf7c53890d945174ea89dd8a711c528d0711e4c1c63616c6e2"} Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.300809 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerStarted","Data":"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee"} Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.303535 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerStarted","Data":"a97088bc226d5155802489f7ac6a208ee9b1cacfbbb954588201d395f2a07500"} Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.303697 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-log" containerID="cri-o://2145d899923840c33715fa17628307294c5047421a38139dabc06cf3d05cb997" gracePeriod=30 Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.303823 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-httpd" containerID="cri-o://2d17a7d49f7479975731597d7e17ac81d17ead2b622a0ef7093e781d499f7009" gracePeriod=30 Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.303836 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-api" containerID="cri-o://a97088bc226d5155802489f7ac6a208ee9b1cacfbbb954588201d395f2a07500" gracePeriod=30 Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.333026 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.333010475 podStartE2EDuration="3.333010475s" podCreationTimestamp="2026-01-31 15:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:58.331671209 +0000 UTC m=+1222.706384094" watchObservedRunningTime="2026-01-31 15:01:58.333010475 +0000 UTC m=+1222.707723360" Jan 31 15:01:58 crc kubenswrapper[4751]: I0131 15:01:58.363515 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.363494718 podStartE2EDuration="4.363494718s" podCreationTimestamp="2026-01-31 15:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:01:58.360409527 +0000 UTC m=+1222.735122432" watchObservedRunningTime="2026-01-31 15:01:58.363494718 +0000 UTC m=+1222.738207603" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313243 4751 generic.go:334] "Generic (PLEG): container finished" podID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerID="a97088bc226d5155802489f7ac6a208ee9b1cacfbbb954588201d395f2a07500" exitCode=143 Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313606 4751 generic.go:334] "Generic (PLEG): container finished" podID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerID="2d17a7d49f7479975731597d7e17ac81d17ead2b622a0ef7093e781d499f7009" exitCode=143 Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313614 4751 generic.go:334] "Generic (PLEG): container finished" podID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerID="2145d899923840c33715fa17628307294c5047421a38139dabc06cf3d05cb997" exitCode=143 Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313306 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerDied","Data":"a97088bc226d5155802489f7ac6a208ee9b1cacfbbb954588201d395f2a07500"} Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313728 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerDied","Data":"2d17a7d49f7479975731597d7e17ac81d17ead2b622a0ef7093e781d499f7009"} Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313745 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerDied","Data":"2145d899923840c33715fa17628307294c5047421a38139dabc06cf3d05cb997"} Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313759 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"221322d6-160f-48ee-bed1-a02ac6cbfb09","Type":"ContainerDied","Data":"31e67ae7532286bf7c53890d945174ea89dd8a711c528d0711e4c1c63616c6e2"} Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.313772 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31e67ae7532286bf7c53890d945174ea89dd8a711c528d0711e4c1c63616c6e2" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.345946 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437385 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-nvme\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437419 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-iscsi\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437449 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-scripts\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc 
kubenswrapper[4751]: I0131 15:01:59.437504 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-var-locks-brick\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437521 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437538 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437576 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437538 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437560 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54nck\" (UniqueName: \"kubernetes.io/projected/221322d6-160f-48ee-bed1-a02ac6cbfb09-kube-api-access-54nck\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437635 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437751 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-logs\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437773 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-dev\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437793 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-lib-modules\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437839 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-sys\") 
pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437858 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-httpd-run\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437883 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-config-data\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.437908 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-run\") pod \"221322d6-160f-48ee-bed1-a02ac6cbfb09\" (UID: \"221322d6-160f-48ee-bed1-a02ac6cbfb09\") " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438157 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-run" (OuterVolumeSpecName: "run") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438270 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-sys" (OuterVolumeSpecName: "sys") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438493 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438515 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438527 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438537 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438547 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438558 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-var-locks-brick\") on node \"crc\" DevicePath 
\"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438586 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-dev" (OuterVolumeSpecName: "dev") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438848 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.438907 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-logs" (OuterVolumeSpecName: "logs") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.444024 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.444438 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). 
InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.445061 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221322d6-160f-48ee-bed1-a02ac6cbfb09-kube-api-access-54nck" (OuterVolumeSpecName: "kube-api-access-54nck") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "kube-api-access-54nck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.469301 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-scripts" (OuterVolumeSpecName: "scripts") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.518437 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-config-data" (OuterVolumeSpecName: "config-data") pod "221322d6-160f-48ee-bed1-a02ac6cbfb09" (UID: "221322d6-160f-48ee-bed1-a02ac6cbfb09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539551 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539596 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539614 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54nck\" (UniqueName: \"kubernetes.io/projected/221322d6-160f-48ee-bed1-a02ac6cbfb09-kube-api-access-54nck\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539629 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539641 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/221322d6-160f-48ee-bed1-a02ac6cbfb09-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539653 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/221322d6-160f-48ee-bed1-a02ac6cbfb09-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539665 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.539675 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/221322d6-160f-48ee-bed1-a02ac6cbfb09-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.564427 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.568725 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.641281 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:01:59 crc kubenswrapper[4751]: I0131 15:01:59.641324 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.321012 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.365879 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.374296 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.386349 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:00 crc kubenswrapper[4751]: E0131 15:02:00.386647 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-log" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.386670 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-log" Jan 31 15:02:00 crc kubenswrapper[4751]: E0131 15:02:00.386694 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-httpd" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.386703 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-httpd" Jan 31 15:02:00 crc kubenswrapper[4751]: E0131 15:02:00.386710 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-api" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.386717 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-api" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.386850 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-httpd" Jan 31 15:02:00 crc 
kubenswrapper[4751]: I0131 15:02:00.386875 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-log" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.386892 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" containerName="glance-api" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.387873 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.390477 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.403079 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.417546 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221322d6-160f-48ee-bed1-a02ac6cbfb09" path="/var/lib/kubelet/pods/221322d6-160f-48ee-bed1-a02ac6cbfb09/volumes" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453276 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-run\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453326 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc 
kubenswrapper[4751]: I0131 15:02:00.453350 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453371 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453403 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453441 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453498 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453552 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453616 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-dev\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453648 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8cqp\" (UniqueName: \"kubernetes.io/projected/f0b77b88-19a5-4bdc-87a1-6a65273226a2-kube-api-access-l8cqp\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453898 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453963 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.453996 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.454019 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-sys\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.554836 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-dev\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.554913 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8cqp\" (UniqueName: \"kubernetes.io/projected/f0b77b88-19a5-4bdc-87a1-6a65273226a2-kube-api-access-l8cqp\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.554973 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.554981 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-dev\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555021 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555040 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-sys\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555098 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555160 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-sys\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555201 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555270 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555216 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-run\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555307 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc 
kubenswrapper[4751]: I0131 15:02:00.555331 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555409 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555430 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555356 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555338 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-run\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555513 4751 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555377 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555606 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555633 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555673 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.555997 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.556207 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.560051 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.560741 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.571729 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8cqp\" (UniqueName: \"kubernetes.io/projected/f0b77b88-19a5-4bdc-87a1-6a65273226a2-kube-api-access-l8cqp\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.575261 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.583269 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:00 crc kubenswrapper[4751]: I0131 15:02:00.709989 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:01 crc kubenswrapper[4751]: I0131 15:02:01.189880 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:01 crc kubenswrapper[4751]: I0131 15:02:01.330551 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerStarted","Data":"58907fb108567ccc157e944740b878da74b00dd4afd8f71e705346251d50d030"} Jan 31 15:02:02 crc kubenswrapper[4751]: I0131 15:02:02.340139 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerStarted","Data":"d00daa5873d01a2a52917e99a159d4dd523630cf9771ab415ea43dc1ab2768ec"} Jan 31 15:02:02 crc kubenswrapper[4751]: I0131 15:02:02.340691 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerStarted","Data":"e332c13695ee9418872977980d66d846c226e037834cc39d0c88b742a39fc6a9"} Jan 31 15:02:02 crc kubenswrapper[4751]: I0131 
15:02:02.340706 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerStarted","Data":"cdf19d70be1f20e70e6253f8f6e27452c7be5e6f13392a31b245256551aa31c1"} Jan 31 15:02:02 crc kubenswrapper[4751]: I0131 15:02:02.369886 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.369866608 podStartE2EDuration="2.369866608s" podCreationTimestamp="2026-01-31 15:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:02.364843275 +0000 UTC m=+1226.739556150" watchObservedRunningTime="2026-01-31 15:02:02.369866608 +0000 UTC m=+1226.744579493" Jan 31 15:02:05 crc kubenswrapper[4751]: I0131 15:02:05.781365 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:05 crc kubenswrapper[4751]: I0131 15:02:05.782030 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:05 crc kubenswrapper[4751]: I0131 15:02:05.782060 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:05 crc kubenswrapper[4751]: I0131 15:02:05.807787 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:05 crc kubenswrapper[4751]: I0131 15:02:05.815643 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:05 crc kubenswrapper[4751]: I0131 15:02:05.833397 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:06 crc kubenswrapper[4751]: I0131 15:02:06.376936 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:06 crc kubenswrapper[4751]: I0131 15:02:06.377016 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:06 crc kubenswrapper[4751]: I0131 15:02:06.377043 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:06 crc kubenswrapper[4751]: I0131 15:02:06.395678 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:06 crc kubenswrapper[4751]: I0131 15:02:06.396808 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:06 crc kubenswrapper[4751]: I0131 15:02:06.399226 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:08 crc kubenswrapper[4751]: I0131 15:02:08.896923 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:02:08 crc kubenswrapper[4751]: I0131 15:02:08.897527 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:02:10 crc kubenswrapper[4751]: I0131 
15:02:10.710967 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:10 crc kubenswrapper[4751]: I0131 15:02:10.711274 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:10 crc kubenswrapper[4751]: I0131 15:02:10.711285 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:10 crc kubenswrapper[4751]: I0131 15:02:10.744500 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:10 crc kubenswrapper[4751]: I0131 15:02:10.753213 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:10 crc kubenswrapper[4751]: I0131 15:02:10.776691 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:11 crc kubenswrapper[4751]: I0131 15:02:11.421808 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:11 crc kubenswrapper[4751]: I0131 15:02:11.421857 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:11 crc kubenswrapper[4751]: I0131 15:02:11.421866 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:11 crc kubenswrapper[4751]: I0131 15:02:11.435409 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:11 crc kubenswrapper[4751]: I0131 15:02:11.435486 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:11 crc kubenswrapper[4751]: I0131 15:02:11.436275 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.712029 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.715264 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.726236 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.727603 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.743376 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.777300 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.810097 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.811736 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.821000 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.822868 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.836134 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.846574 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867745 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-sys\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867773 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867796 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrk5\" (UniqueName: \"kubernetes.io/projected/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-kube-api-access-vkrk5\") pod \"glance-default-external-api-2\" (UID: 
\"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867834 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-config-data\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867853 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867869 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867885 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t25kc\" (UniqueName: \"kubernetes.io/projected/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-kube-api-access-t25kc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867908 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-logs\") 
pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867927 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-logs\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867942 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867956 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-dev\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867972 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.867995 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod 
\"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868011 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-dev\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868030 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-run\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868043 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-scripts\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868063 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-scripts\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868101 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868204 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868298 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-sys\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868415 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868483 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868504 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868523 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-config-data\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.868544 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.869424 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.869481 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-run\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.971336 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t25kc\" (UniqueName: \"kubernetes.io/projected/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-kube-api-access-t25kc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.971608 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-logs\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.971688 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-dev\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.971800 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972120 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-logs\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972647 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-logs\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972677 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972702 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-dev\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972724 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972744 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972764 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972779 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-scripts\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972822 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972840 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-dev\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972862 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972917 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-run\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972967 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-scripts\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972986 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.972996 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-logs\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973012 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-run\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973051 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-dev\") pod 
\"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973101 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-scripts\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973148 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973169 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973189 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-logs\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973210 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-iscsi\") pod 
\"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973244 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-sys\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973269 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973299 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973353 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-sys\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973356 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: 
\"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973377 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-dev\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973596 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-config-data\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973614 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973628 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-sys\") pod \"glance-default-external-api-1\" (UID: 
\"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973786 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973815 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.973846 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-run\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.974033 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.974974 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfkw8\" (UniqueName: \"kubernetes.io/projected/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-kube-api-access-lfkw8\") pod \"glance-default-internal-api-2\" (UID: 
\"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975050 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975085 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vklgz\" (UniqueName: \"kubernetes.io/projected/95acd323-0a11-4e25-8439-f848c8811df5-kube-api-access-vklgz\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975163 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-dev\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975218 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975246 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-var-locks-brick\") pod \"glance-default-external-api-2\" 
(UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-config-data\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975277 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975288 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975282 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975326 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: 
\"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975353 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975469 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975546 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-config-data\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975573 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-run\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975588 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-lib-modules\") pod \"glance-default-external-api-1\" (UID: 
\"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975613 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975645 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-run\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975697 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975730 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975692 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " 
pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975846 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-scripts\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975894 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-logs\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975918 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975939 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-sys\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.975954 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-sys\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976051 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976113 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrk5\" (UniqueName: \"kubernetes.io/projected/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-kube-api-access-vkrk5\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976126 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976138 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-run\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976064 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-sys\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976195 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-config-data\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976223 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976246 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976269 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976516 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.976663 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.980346 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-scripts\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.983212 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-config-data\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.984354 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-config-data\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.988166 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t25kc\" (UniqueName: \"kubernetes.io/projected/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-kube-api-access-t25kc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.988675 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-scripts\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:13 crc kubenswrapper[4751]: I0131 15:02:13.996376 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrk5\" (UniqueName: \"kubernetes.io/projected/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-kube-api-access-vkrk5\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.000625 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.001323 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.008149 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.014704 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-external-api-2\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.043999 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.059706 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077677 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-run\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077758 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077796 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-dev\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077870 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077907 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077935 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077954 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-scripts\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077964 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-run\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.077990 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078042 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078084 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-run\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078095 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078400 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078122 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078488 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078489 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078515 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-logs\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078301 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-run\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078156 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078165 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078575 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078116 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078270 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078754 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-sys\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078809 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078863 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078905 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-config-data\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078931 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfkw8\" (UniqueName: \"kubernetes.io/projected/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-kube-api-access-lfkw8\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078969 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vklgz\" (UniqueName: \"kubernetes.io/projected/95acd323-0a11-4e25-8439-f848c8811df5-kube-api-access-vklgz\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078998 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-dev\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079035 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-logs\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079056 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079110 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-sys\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079115 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079137 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079417 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-dev\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.078416 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-dev\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079867 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-config-data\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079918 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079947 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-scripts\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.079983 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-logs\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.080009 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.080034 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-sys\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.080135 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.080273 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.080317 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.083254 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-scripts\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.083301 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-sys\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.083316 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.083665 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.084486 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-logs\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.088956 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-config-data\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.088984 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-config-data\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.096909 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-scripts\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.101597 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfkw8\" (UniqueName: \"kubernetes.io/projected/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-kube-api-access-lfkw8\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.103730 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.105637 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.109176 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vklgz\" (UniqueName: \"kubernetes.io/projected/95acd323-0a11-4e25-8439-f848c8811df5-kube-api-access-vklgz\") pod \"glance-default-internal-api-1\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.110654 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.110830 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.132940 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.146820 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.509510 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"]
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.561151 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Jan 31 15:02:14 crc kubenswrapper[4751]: W0131 15:02:14.563103 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod320d0141_d27c_4f4d_9527_ae0f4db2f4fe.slice/crio-bcdaf73c5ebfac54d35792da2a6c14709ac5d1c622bf7ee51e304db07c548663 WatchSource:0}: Error finding container bcdaf73c5ebfac54d35792da2a6c14709ac5d1c622bf7ee51e304db07c548663: Status 404 returned error can't find the container with id bcdaf73c5ebfac54d35792da2a6c14709ac5d1c622bf7ee51e304db07c548663
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.626094 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"]
Jan 31 15:02:14 crc kubenswrapper[4751]: I0131 15:02:14.636013 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Jan 31 15:02:14 crc kubenswrapper[4751]: W0131 15:02:14.636561 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95acd323_0a11_4e25_8439_f848c8811df5.slice/crio-cf4c0a604cd5f16c5907f38582ee871fe39142f1bf81d14eb1c48a46548f4e79 WatchSource:0}: Error finding container cf4c0a604cd5f16c5907f38582ee871fe39142f1bf81d14eb1c48a46548f4e79: Status 404 returned error can't find the container with id cf4c0a604cd5f16c5907f38582ee871fe39142f1bf81d14eb1c48a46548f4e79
Jan 31 15:02:14 crc kubenswrapper[4751]: W0131 15:02:14.657658 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ad25a0a_80c0_46fc_9eb7_c91e86c2d3ad.slice/crio-71e60331171172a0be59cf0f295e6f9b8b83fa45d5df8411886bce4a9f2a6d9c WatchSource:0}: Error finding container 71e60331171172a0be59cf0f295e6f9b8b83fa45d5df8411886bce4a9f2a6d9c: Status 404 returned error can't find the container with id 71e60331171172a0be59cf0f295e6f9b8b83fa45d5df8411886bce4a9f2a6d9c
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.458496 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerStarted","Data":"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.458950 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerStarted","Data":"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.458961 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerStarted","Data":"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.458970 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerStarted","Data":"bcdaf73c5ebfac54d35792da2a6c14709ac5d1c622bf7ee51e304db07c548663"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.461946 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerStarted","Data":"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.461970 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerStarted","Data":"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.461980 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerStarted","Data":"cf4c0a604cd5f16c5907f38582ee871fe39142f1bf81d14eb1c48a46548f4e79"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.466207 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerStarted","Data":"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.466235 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerStarted","Data":"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.466244 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerStarted","Data":"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.466253 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerStarted","Data":"d8672d8a656b9f58508baa22372a3b5bcd5f2f26025dd43a8c5d2f9ca074eb76"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.476215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerStarted","Data":"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.476263 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerStarted","Data":"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.476280 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerStarted","Data":"71e60331171172a0be59cf0f295e6f9b8b83fa45d5df8411886bce4a9f2a6d9c"}
Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.486964 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.486941399
podStartE2EDuration="3.486941399s" podCreationTimestamp="2026-01-31 15:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:15.484438663 +0000 UTC m=+1239.859151548" watchObservedRunningTime="2026-01-31 15:02:15.486941399 +0000 UTC m=+1239.861654284" Jan 31 15:02:15 crc kubenswrapper[4751]: I0131 15:02:15.513377 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.513359105 podStartE2EDuration="3.513359105s" podCreationTimestamp="2026-01-31 15:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:15.509526604 +0000 UTC m=+1239.884239489" watchObservedRunningTime="2026-01-31 15:02:15.513359105 +0000 UTC m=+1239.888071990" Jan 31 15:02:16 crc kubenswrapper[4751]: I0131 15:02:16.486057 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerStarted","Data":"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b"} Jan 31 15:02:16 crc kubenswrapper[4751]: I0131 15:02:16.490231 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerStarted","Data":"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf"} Jan 31 15:02:16 crc kubenswrapper[4751]: I0131 15:02:16.534355 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=4.534335358 podStartE2EDuration="4.534335358s" podCreationTimestamp="2026-01-31 15:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-31 15:02:16.52985317 +0000 UTC m=+1240.904566055" watchObservedRunningTime="2026-01-31 15:02:16.534335358 +0000 UTC m=+1240.909048253" Jan 31 15:02:16 crc kubenswrapper[4751]: I0131 15:02:16.564665 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=4.5646452360000005 podStartE2EDuration="4.564645236s" podCreationTimestamp="2026-01-31 15:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:16.563801414 +0000 UTC m=+1240.938514319" watchObservedRunningTime="2026-01-31 15:02:16.564645236 +0000 UTC m=+1240.939358121" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.044330 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.046389 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.046500 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.060792 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.060841 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.060852 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.074241 4751 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.084809 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.087391 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.090876 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.091279 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.119093 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.134226 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.134276 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.134287 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.149542 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.149587 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.149636 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.160788 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.163256 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.174280 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.176196 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.179286 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.193329 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.549005 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550301 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550347 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc 
kubenswrapper[4751]: I0131 15:02:24.550368 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550386 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550403 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550419 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550488 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550507 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550521 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550537 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.550553 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.562637 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.563511 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.566211 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.566282 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.567152 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.568242 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.569012 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.570395 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.571572 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.572764 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.576595 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:24 crc kubenswrapper[4751]: I0131 15:02:24.580399 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:25 crc 
kubenswrapper[4751]: I0131 15:02:25.295864 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:02:25 crc kubenswrapper[4751]: I0131 15:02:25.310054 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:02:25 crc kubenswrapper[4751]: I0131 15:02:25.500986 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:02:25 crc kubenswrapper[4751]: I0131 15:02:25.509336 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.561539 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-log" containerID="cri-o://6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.561625 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-api" containerID="cri-o://6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.561637 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-httpd" containerID="cri-o://682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.561731 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" 
podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-log" containerID="cri-o://dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.561787 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-api" containerID="cri-o://780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.561802 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-httpd" containerID="cri-o://ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.562039 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-log" containerID="cri-o://437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.562056 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-api" containerID="cri-o://14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.562081 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-httpd" 
containerID="cri-o://1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.562192 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-log" containerID="cri-o://06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.562252 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-api" containerID="cri-o://a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" gracePeriod=30 Jan 31 15:02:26 crc kubenswrapper[4751]: I0131 15:02:26.562266 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-httpd" containerID="cri-o://b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" gracePeriod=30 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.366550 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.446661 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-scripts\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.446765 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-nvme\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.446799 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.446830 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-iscsi\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447119 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-var-locks-brick\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447155 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-sys\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447221 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-dev\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447252 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-lib-modules\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447342 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-config-data\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447350 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447388 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-run\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447462 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-run" (OuterVolumeSpecName: "run") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447526 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447549 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-httpd-run\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447561 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-sys" (OuterVolumeSpecName: "sys") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447595 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-dev" (OuterVolumeSpecName: "dev") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447620 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-logs\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447628 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447674 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfkw8\" (UniqueName: \"kubernetes.io/projected/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-kube-api-access-lfkw8\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.447727 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\" (UID: \"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.448977 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-logs" (OuterVolumeSpecName: "logs") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449208 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449903 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449941 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449952 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449962 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449971 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449982 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.449992 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.450002 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.450236 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.458243 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.458676 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-scripts" (OuterVolumeSpecName: "scripts") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.458751 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-kube-api-access-lfkw8" (OuterVolumeSpecName: "kube-api-access-lfkw8") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "kube-api-access-lfkw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.462194 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.489527 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.501098 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.507213 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.550816 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkrk5\" (UniqueName: \"kubernetes.io/projected/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-kube-api-access-vkrk5\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551064 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-config-data\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551193 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-iscsi\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551283 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551410 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-var-locks-brick\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551432 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551763 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-sys\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551962 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-lib-modules\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552348 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552463 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-nvme\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552579 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-httpd-run\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552688 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-run\") pod 
\"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552820 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-dev\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552931 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-logs\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.553046 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-scripts\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.553151 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\" (UID: \"6a459e47-85a7-4f4d-84ba-a7d3e01180dc\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.553723 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.553811 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfkw8\" (UniqueName: \"kubernetes.io/projected/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-kube-api-access-lfkw8\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: 
I0131 15:02:27.553884 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.553966 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.554036 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.554125 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.554203 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.551895 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-sys" (OuterVolumeSpecName: "sys") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.552283 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). 
InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.556275 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-dev" (OuterVolumeSpecName: "dev") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.556314 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.557042 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.557061 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-logs" (OuterVolumeSpecName: "logs") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.558300 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-scripts" (OuterVolumeSpecName: "scripts") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.564020 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-run" (OuterVolumeSpecName: "run") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.564362 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-kube-api-access-vkrk5" (OuterVolumeSpecName: "kube-api-access-vkrk5") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "kube-api-access-vkrk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.575481 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.575681 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.579156 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.584038 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.588236 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-config-data" (OuterVolumeSpecName: "config-data") pod "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" (UID: "7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597717 4751 generic.go:334] "Generic (PLEG): container finished" podID="95acd323-0a11-4e25-8439-f848c8811df5" containerID="14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597751 4751 generic.go:334] "Generic (PLEG): container finished" podID="95acd323-0a11-4e25-8439-f848c8811df5" containerID="1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597762 4751 generic.go:334] "Generic (PLEG): container finished" podID="95acd323-0a11-4e25-8439-f848c8811df5" containerID="437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" exitCode=143 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597812 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerDied","Data":"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597843 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerDied","Data":"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597856 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerDied","Data":"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597871 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"95acd323-0a11-4e25-8439-f848c8811df5","Type":"ContainerDied","Data":"cf4c0a604cd5f16c5907f38582ee871fe39142f1bf81d14eb1c48a46548f4e79"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.597889 4751 scope.go:117] "RemoveContainer" containerID="14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.598030 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608498 4751 generic.go:334] "Generic (PLEG): container finished" podID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerID="780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608533 4751 generic.go:334] "Generic (PLEG): container finished" podID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerID="ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608542 4751 generic.go:334] "Generic (PLEG): container finished" podID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerID="dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" exitCode=143 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608616 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerDied","Data":"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608650 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerDied","Data":"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 
15:02:27.608663 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerDied","Data":"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608676 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"6a459e47-85a7-4f4d-84ba-a7d3e01180dc","Type":"ContainerDied","Data":"d8672d8a656b9f58508baa22372a3b5bcd5f2f26025dd43a8c5d2f9ca074eb76"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.608750 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619808 4751 generic.go:334] "Generic (PLEG): container finished" podID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerID="a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619866 4751 generic.go:334] "Generic (PLEG): container finished" podID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerID="b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619861 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerDied","Data":"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619895 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619913 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerDied","Data":"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619925 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerDied","Data":"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619877 4751 generic.go:334] "Generic (PLEG): container finished" podID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerID="06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" exitCode=143 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.619976 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad","Type":"ContainerDied","Data":"71e60331171172a0be59cf0f295e6f9b8b83fa45d5df8411886bce4a9f2a6d9c"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626706 4751 generic.go:334] "Generic (PLEG): container finished" podID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerID="6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626741 4751 generic.go:334] "Generic (PLEG): container finished" podID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerID="682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" exitCode=0 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626749 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626766 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerDied","Data":"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626799 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerDied","Data":"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626811 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerDied","Data":"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626751 4751 generic.go:334] "Generic (PLEG): container finished" podID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerID="6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" exitCode=143 Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.626901 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"320d0141-d27c-4f4d-9527-ae0f4db2f4fe","Type":"ContainerDied","Data":"bcdaf73c5ebfac54d35792da2a6c14709ac5d1c622bf7ee51e304db07c548663"} Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.634778 4751 scope.go:117] "RemoveContainer" containerID="1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.662440 4751 scope.go:117] "RemoveContainer" containerID="437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" Jan 31 15:02:27 
crc kubenswrapper[4751]: I0131 15:02:27.662905 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-nvme\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.662939 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-scripts\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.662972 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-dev\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.662991 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-httpd-run\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663038 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-httpd-run\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663103 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-run\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: 
\"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663132 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-config-data\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663177 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-sys\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663227 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-dev\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663253 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t25kc\" (UniqueName: \"kubernetes.io/projected/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-kube-api-access-t25kc\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663289 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-var-locks-brick\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663334 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-run\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663358 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-iscsi\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663376 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-var-locks-brick\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663410 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-scripts\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663441 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663470 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-logs\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663498 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-lib-modules\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663517 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663540 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-config-data\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663560 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-iscsi\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663580 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-lib-modules\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663602 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-logs\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663649 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-sys\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663670 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\" (UID: \"320d0141-d27c-4f4d-9527-ae0f4db2f4fe\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663689 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-nvme\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663708 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.663745 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vklgz\" (UniqueName: \"kubernetes.io/projected/95acd323-0a11-4e25-8439-f848c8811df5-kube-api-access-vklgz\") pod \"95acd323-0a11-4e25-8439-f848c8811df5\" (UID: \"95acd323-0a11-4e25-8439-f848c8811df5\") " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664089 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664114 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664128 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkrk5\" (UniqueName: \"kubernetes.io/projected/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-kube-api-access-vkrk5\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664142 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664153 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664164 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664176 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664190 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664201 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664221 4751 reconciler_common.go:293] "Volume 
detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664234 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664245 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664255 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.664266 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.668031 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-config-data" (OuterVolumeSpecName: "config-data") pod "6a459e47-85a7-4f4d-84ba-a7d3e01180dc" (UID: "6a459e47-85a7-4f4d-84ba-a7d3e01180dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.668872 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-dev" (OuterVolumeSpecName: "dev") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.668929 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.669062 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.669302 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.669332 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-run" (OuterVolumeSpecName: "run") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.669356 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.669373 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-sys" (OuterVolumeSpecName: "sys") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.669390 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-dev" (OuterVolumeSpecName: "dev") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.670055 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671133 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671182 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671220 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-run" (OuterVolumeSpecName: "run") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671254 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671490 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-logs" (OuterVolumeSpecName: "logs") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671490 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-scripts" (OuterVolumeSpecName: "scripts") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671512 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.671533 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-sys" (OuterVolumeSpecName: "sys") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.672116 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.672128 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.672201 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-logs" (OuterVolumeSpecName: "logs") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.673144 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-scripts" (OuterVolumeSpecName: "scripts") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.673316 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-kube-api-access-t25kc" (OuterVolumeSpecName: "kube-api-access-t25kc") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "kube-api-access-t25kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.674585 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.674890 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.676141 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95acd323-0a11-4e25-8439-f848c8811df5-kube-api-access-vklgz" (OuterVolumeSpecName: "kube-api-access-vklgz") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "kube-api-access-vklgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.678292 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.679264 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.679761 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.728645 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.737525 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765321 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765356 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765365 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765374 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t25kc\" (UniqueName: \"kubernetes.io/projected/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-kube-api-access-t25kc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: 
I0131 15:02:27.765383 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765392 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765400 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765408 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765416 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765427 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765467 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765475 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-logs\") on node \"crc\" DevicePath 
\"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765484 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765503 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765511 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765520 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765528 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765538 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765545 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765558 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765566 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/95acd323-0a11-4e25-8439-f848c8811df5-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765578 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765586 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a459e47-85a7-4f4d-84ba-a7d3e01180dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765595 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vklgz\" (UniqueName: \"kubernetes.io/projected/95acd323-0a11-4e25-8439-f848c8811df5-kube-api-access-vklgz\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765603 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765612 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765619 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765627 4751 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95acd323-0a11-4e25-8439-f848c8811df5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.765635 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.782695 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-config-data" (OuterVolumeSpecName: "config-data") pod "320d0141-d27c-4f4d-9527-ae0f4db2f4fe" (UID: "320d0141-d27c-4f4d-9527-ae0f4db2f4fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.787189 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.787642 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.789634 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.789983 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-config-data" (OuterVolumeSpecName: "config-data") pod "95acd323-0a11-4e25-8439-f848c8811df5" (UID: "95acd323-0a11-4e25-8439-f848c8811df5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.795386 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.845410 4751 scope.go:117] "RemoveContainer" containerID="14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.845890 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": container with ID starting with 14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf not found: ID does not exist" containerID="14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.845923 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf"} err="failed to get container status \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": rpc error: code = NotFound desc = could not find container \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": container with ID starting with 14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.845944 4751 scope.go:117] "RemoveContainer" containerID="1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.846217 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": container with ID starting with 
1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8 not found: ID does not exist" containerID="1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.846243 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8"} err="failed to get container status \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": rpc error: code = NotFound desc = could not find container \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": container with ID starting with 1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.846255 4751 scope.go:117] "RemoveContainer" containerID="437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.846444 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": container with ID starting with 437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2 not found: ID does not exist" containerID="437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.846513 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2"} err="failed to get container status \"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": rpc error: code = NotFound desc = could not find container \"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": container with ID starting with 437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2 not found: ID does not 
exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.846531 4751 scope.go:117] "RemoveContainer" containerID="14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.846824 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf"} err="failed to get container status \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": rpc error: code = NotFound desc = could not find container \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": container with ID starting with 14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.846848 4751 scope.go:117] "RemoveContainer" containerID="1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847160 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8"} err="failed to get container status \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": rpc error: code = NotFound desc = could not find container \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": container with ID starting with 1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847181 4751 scope.go:117] "RemoveContainer" containerID="437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847387 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2"} err="failed to get container status 
\"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": rpc error: code = NotFound desc = could not find container \"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": container with ID starting with 437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847414 4751 scope.go:117] "RemoveContainer" containerID="14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847603 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf"} err="failed to get container status \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": rpc error: code = NotFound desc = could not find container \"14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf\": container with ID starting with 14cacda155a9f23ba59e4f2a3da7afceb0ef3ab9efee85c5bb5e5e8c246a77bf not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847621 4751 scope.go:117] "RemoveContainer" containerID="1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847804 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8"} err="failed to get container status \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": rpc error: code = NotFound desc = could not find container \"1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8\": container with ID starting with 1604ddf440b351b315839a0e9505346fa56959eb8d95f8bc4383bea7078e51c8 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847824 4751 scope.go:117] "RemoveContainer" 
containerID="437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.847992 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2"} err="failed to get container status \"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": rpc error: code = NotFound desc = could not find container \"437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2\": container with ID starting with 437b68b7d490ec3c812b65b9b5b235cb5bc63fbc0448ab8d751106a1f12c81b2 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.848018 4751 scope.go:117] "RemoveContainer" containerID="780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.864787 4751 scope.go:117] "RemoveContainer" containerID="ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.866668 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.866694 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.866705 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95acd323-0a11-4e25-8439-f848c8811df5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.866718 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node 
\"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.866729 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.866739 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320d0141-d27c-4f4d-9527-ae0f4db2f4fe-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.882952 4751 scope.go:117] "RemoveContainer" containerID="dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.901456 4751 scope.go:117] "RemoveContainer" containerID="780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.902001 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": container with ID starting with 780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a not found: ID does not exist" containerID="780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.902056 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a"} err="failed to get container status \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": rpc error: code = NotFound desc = could not find container \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": container with ID starting with 780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 
15:02:27.902107 4751 scope.go:117] "RemoveContainer" containerID="ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.902538 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": container with ID starting with ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1 not found: ID does not exist" containerID="ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.902577 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1"} err="failed to get container status \"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": rpc error: code = NotFound desc = could not find container \"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": container with ID starting with ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.902605 4751 scope.go:117] "RemoveContainer" containerID="dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.903296 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": container with ID starting with dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9 not found: ID does not exist" containerID="dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.903327 4751 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9"} err="failed to get container status \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": rpc error: code = NotFound desc = could not find container \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": container with ID starting with dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.903348 4751 scope.go:117] "RemoveContainer" containerID="780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.903799 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a"} err="failed to get container status \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": rpc error: code = NotFound desc = could not find container \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": container with ID starting with 780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.903826 4751 scope.go:117] "RemoveContainer" containerID="ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904245 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1"} err="failed to get container status \"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": rpc error: code = NotFound desc = could not find container \"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": container with ID starting with ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1 not found: ID does not 
exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904264 4751 scope.go:117] "RemoveContainer" containerID="dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904522 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9"} err="failed to get container status \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": rpc error: code = NotFound desc = could not find container \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": container with ID starting with dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904537 4751 scope.go:117] "RemoveContainer" containerID="780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904750 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a"} err="failed to get container status \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": rpc error: code = NotFound desc = could not find container \"780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a\": container with ID starting with 780c4a5951b3f68e86416bac67962e1bcf96fd06498cc8fffeda627b35ea2d9a not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904764 4751 scope.go:117] "RemoveContainer" containerID="ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.904982 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1"} err="failed to get container status 
\"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": rpc error: code = NotFound desc = could not find container \"ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1\": container with ID starting with ad949de535e978e687a00582bdf68a90265d128f5c20313eaa241f804c6802e1 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.905024 4751 scope.go:117] "RemoveContainer" containerID="dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.905292 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9"} err="failed to get container status \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": rpc error: code = NotFound desc = could not find container \"dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9\": container with ID starting with dfe4c141e34d71abd0cb1dc9d7e5475327eb2788a53a9e9597a4f06bd94deca9 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.905308 4751 scope.go:117] "RemoveContainer" containerID="a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.936285 4751 scope.go:117] "RemoveContainer" containerID="b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.936603 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.942270 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.959851 4751 scope.go:117] "RemoveContainer" containerID="06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" Jan 31 15:02:27 
crc kubenswrapper[4751]: I0131 15:02:27.972153 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.978629 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.986895 4751 scope.go:117] "RemoveContainer" containerID="a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.987410 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": container with ID starting with a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b not found: ID does not exist" containerID="a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.987443 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b"} err="failed to get container status \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": rpc error: code = NotFound desc = could not find container \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": container with ID starting with a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.987466 4751 scope.go:117] "RemoveContainer" containerID="b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.987730 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": container with ID starting with b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8 not found: ID does not exist" containerID="b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.987755 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8"} err="failed to get container status \"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": rpc error: code = NotFound desc = could not find container \"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": container with ID starting with b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.987771 4751 scope.go:117] "RemoveContainer" containerID="06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" Jan 31 15:02:27 crc kubenswrapper[4751]: E0131 15:02:27.988246 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": container with ID starting with 06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90 not found: ID does not exist" containerID="06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.988279 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90"} err="failed to get container status \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": rpc error: code = NotFound desc = could not find container \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": container with ID 
starting with 06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.988297 4751 scope.go:117] "RemoveContainer" containerID="a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.988516 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b"} err="failed to get container status \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": rpc error: code = NotFound desc = could not find container \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": container with ID starting with a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.988539 4751 scope.go:117] "RemoveContainer" containerID="b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.988805 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8"} err="failed to get container status \"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": rpc error: code = NotFound desc = could not find container \"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": container with ID starting with b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.988829 4751 scope.go:117] "RemoveContainer" containerID="06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.989220 4751 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90"} err="failed to get container status \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": rpc error: code = NotFound desc = could not find container \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": container with ID starting with 06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.989243 4751 scope.go:117] "RemoveContainer" containerID="a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.989526 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b"} err="failed to get container status \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": rpc error: code = NotFound desc = could not find container \"a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b\": container with ID starting with a5e9523257d5d31be3474c9ebb00bf1150499e723b36a65c3a75e75e8cce0b9b not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.989552 4751 scope.go:117] "RemoveContainer" containerID="b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.989784 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8"} err="failed to get container status \"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": rpc error: code = NotFound desc = could not find container \"b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8\": container with ID starting with b399921133ccb6ca2d797749e4080f6c6ff9b63570cdf4da4a13f5d0b74243c8 not found: ID does not 
exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.989812 4751 scope.go:117] "RemoveContainer" containerID="06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.990137 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90"} err="failed to get container status \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": rpc error: code = NotFound desc = could not find container \"06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90\": container with ID starting with 06745ce7f672d1fed8fdec0610aed76d08cb2a27019556cb1771945716a7dd90 not found: ID does not exist" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.990384 4751 scope.go:117] "RemoveContainer" containerID="6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.991586 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:02:27 crc kubenswrapper[4751]: I0131 15:02:27.998034 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.013354 4751 scope.go:117] "RemoveContainer" containerID="682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.038329 4751 scope.go:117] "RemoveContainer" containerID="6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.053868 4751 scope.go:117] "RemoveContainer" containerID="6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" Jan 31 15:02:28 crc kubenswrapper[4751]: E0131 15:02:28.054288 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": container with ID starting with 6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233 not found: ID does not exist" containerID="6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.054334 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233"} err="failed to get container status \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": rpc error: code = NotFound desc = could not find container \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": container with ID starting with 6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233 not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.054358 4751 scope.go:117] "RemoveContainer" containerID="682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" Jan 31 15:02:28 crc kubenswrapper[4751]: E0131 15:02:28.054654 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": container with ID starting with 682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b not found: ID does not exist" containerID="682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.054675 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b"} err="failed to get container status \"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": rpc error: code = NotFound desc = could not find container 
\"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": container with ID starting with 682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.054708 4751 scope.go:117] "RemoveContainer" containerID="6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" Jan 31 15:02:28 crc kubenswrapper[4751]: E0131 15:02:28.054973 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": container with ID starting with 6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe not found: ID does not exist" containerID="6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.054993 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe"} err="failed to get container status \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": rpc error: code = NotFound desc = could not find container \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": container with ID starting with 6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055040 4751 scope.go:117] "RemoveContainer" containerID="6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055236 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233"} err="failed to get container status \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": rpc error: code = NotFound desc = could not find 
container \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": container with ID starting with 6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233 not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055274 4751 scope.go:117] "RemoveContainer" containerID="682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055466 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b"} err="failed to get container status \"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": rpc error: code = NotFound desc = could not find container \"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": container with ID starting with 682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055482 4751 scope.go:117] "RemoveContainer" containerID="6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055653 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe"} err="failed to get container status \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": rpc error: code = NotFound desc = could not find container \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": container with ID starting with 6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055670 4751 scope.go:117] "RemoveContainer" containerID="6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.055985 4751 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233"} err="failed to get container status \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": rpc error: code = NotFound desc = could not find container \"6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233\": container with ID starting with 6ede99f5fdfed08a9d7eac6a184d5660b106e725595d6c7cf676d14eefad9233 not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.056002 4751 scope.go:117] "RemoveContainer" containerID="682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.056379 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b"} err="failed to get container status \"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": rpc error: code = NotFound desc = could not find container \"682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b\": container with ID starting with 682e4ce2a6210e637ba5e4d2dbaf2ac342164e083e46f40cc551cb32e38ee04b not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.056397 4751 scope.go:117] "RemoveContainer" containerID="6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.056661 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe"} err="failed to get container status \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": rpc error: code = NotFound desc = could not find container \"6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe\": container with ID starting with 
6913cb3476dd5e0905008785656c005c7a92a66aba7f669cab05437045db35fe not found: ID does not exist" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.414272 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" path="/var/lib/kubelet/pods/320d0141-d27c-4f4d-9527-ae0f4db2f4fe/volumes" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.415051 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" path="/var/lib/kubelet/pods/6a459e47-85a7-4f4d-84ba-a7d3e01180dc/volumes" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.416192 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" path="/var/lib/kubelet/pods/7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad/volumes" Jan 31 15:02:28 crc kubenswrapper[4751]: I0131 15:02:28.416842 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95acd323-0a11-4e25-8439-f848c8811df5" path="/var/lib/kubelet/pods/95acd323-0a11-4e25-8439-f848c8811df5/volumes" Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.160670 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.162000 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-log" containerID="cri-o://cdf19d70be1f20e70e6253f8f6e27452c7be5e6f13392a31b245256551aa31c1" gracePeriod=30 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.162194 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-httpd" containerID="cri-o://e332c13695ee9418872977980d66d846c226e037834cc39d0c88b742a39fc6a9" 
gracePeriod=30 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.162129 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-api" containerID="cri-o://d00daa5873d01a2a52917e99a159d4dd523630cf9771ab415ea43dc1ab2768ec" gracePeriod=30 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.648589 4751 generic.go:334] "Generic (PLEG): container finished" podID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerID="d00daa5873d01a2a52917e99a159d4dd523630cf9771ab415ea43dc1ab2768ec" exitCode=0 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.648635 4751 generic.go:334] "Generic (PLEG): container finished" podID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerID="e332c13695ee9418872977980d66d846c226e037834cc39d0c88b742a39fc6a9" exitCode=0 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.648643 4751 generic.go:334] "Generic (PLEG): container finished" podID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerID="cdf19d70be1f20e70e6253f8f6e27452c7be5e6f13392a31b245256551aa31c1" exitCode=143 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.648672 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerDied","Data":"d00daa5873d01a2a52917e99a159d4dd523630cf9771ab415ea43dc1ab2768ec"} Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.648722 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerDied","Data":"e332c13695ee9418872977980d66d846c226e037834cc39d0c88b742a39fc6a9"} Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.648735 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerDied","Data":"cdf19d70be1f20e70e6253f8f6e27452c7be5e6f13392a31b245256551aa31c1"} Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.713705 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.714258 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-log" containerID="cri-o://88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" gracePeriod=30 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.714326 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-httpd" containerID="cri-o://b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" gracePeriod=30 Jan 31 15:02:29 crc kubenswrapper[4751]: I0131 15:02:29.714332 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-api" containerID="cri-o://979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" gracePeriod=30 Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.034931 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112539 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-logs\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112587 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-sys\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112613 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-iscsi\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112638 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-config-data\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112652 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-run\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112679 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8cqp\" (UniqueName: 
\"kubernetes.io/projected/f0b77b88-19a5-4bdc-87a1-6a65273226a2-kube-api-access-l8cqp\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112711 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-lib-modules\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-scripts\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112748 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112769 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112787 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-var-locks-brick\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112806 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-nvme\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112847 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-dev\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.112860 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-httpd-run\") pod \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\" (UID: \"f0b77b88-19a5-4bdc-87a1-6a65273226a2\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113114 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113177 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113218 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-run" (OuterVolumeSpecName: "run") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113251 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-sys" (OuterVolumeSpecName: "sys") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113419 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113461 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113480 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113496 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-dev" (OuterVolumeSpecName: "dev") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.113519 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-logs" (OuterVolumeSpecName: "logs") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.119065 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.119181 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b77b88-19a5-4bdc-87a1-6a65273226a2-kube-api-access-l8cqp" (OuterVolumeSpecName: "kube-api-access-l8cqp") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "kube-api-access-l8cqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.119233 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-scripts" (OuterVolumeSpecName: "scripts") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.122279 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.181247 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-config-data" (OuterVolumeSpecName: "config-data") pod "f0b77b88-19a5-4bdc-87a1-6a65273226a2" (UID: "f0b77b88-19a5-4bdc-87a1-6a65273226a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214239 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214278 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214295 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214308 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214321 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214331 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214341 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0b77b88-19a5-4bdc-87a1-6a65273226a2-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214351 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214360 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214372 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214383 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214394 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8cqp\" (UniqueName: \"kubernetes.io/projected/f0b77b88-19a5-4bdc-87a1-6a65273226a2-kube-api-access-l8cqp\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214404 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b77b88-19a5-4bdc-87a1-6a65273226a2-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.214414 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b77b88-19a5-4bdc-87a1-6a65273226a2-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.226272 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.229324 4751 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.315906 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.315941 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.427784 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518213 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-dev\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518268 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-lib-modules\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518305 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-sys\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518356 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w868q\" (UniqueName: \"kubernetes.io/projected/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-kube-api-access-w868q\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518381 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-iscsi\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518377 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-dev" (OuterVolumeSpecName: "dev") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518413 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-logs\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518413 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-sys" (OuterVolumeSpecName: "sys") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518408 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518449 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518466 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-config-data\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518490 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518523 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-scripts\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518557 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518581 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-var-locks-brick\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518630 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-httpd-run\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518655 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-run\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518692 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-nvme\") pod \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\" (UID: \"5c99f5b1-8566-4141-9bd4-71a75e7f43b6\") " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518874 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-logs" (OuterVolumeSpecName: "logs") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.518940 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-run" (OuterVolumeSpecName: "run") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519239 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519450 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519484 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519495 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519504 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519512 4751 reconciler_common.go:293] "Volume detached for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519520 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.519529 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.520349 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.522381 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance-cache") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.525540 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.527267 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.534288 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-scripts" (OuterVolumeSpecName: "scripts") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.535253 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-kube-api-access-w868q" (OuterVolumeSpecName: "kube-api-access-w868q") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "kube-api-access-w868q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.597290 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-config-data" (OuterVolumeSpecName: "config-data") pod "5c99f5b1-8566-4141-9bd4-71a75e7f43b6" (UID: "5c99f5b1-8566-4141-9bd4-71a75e7f43b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621106 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w868q\" (UniqueName: \"kubernetes.io/projected/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-kube-api-access-w868q\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621139 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621170 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621179 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621193 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621201 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.621209 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c99f5b1-8566-4141-9bd4-71a75e7f43b6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.635493 4751 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.646833 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.657926 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"f0b77b88-19a5-4bdc-87a1-6a65273226a2","Type":"ContainerDied","Data":"58907fb108567ccc157e944740b878da74b00dd4afd8f71e705346251d50d030"} Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.657970 4751 scope.go:117] "RemoveContainer" containerID="d00daa5873d01a2a52917e99a159d4dd523630cf9771ab415ea43dc1ab2768ec" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.657968 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661284 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerID="979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" exitCode=0 Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661308 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerID="b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" exitCode=0 Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661315 4751 generic.go:334] "Generic (PLEG): container finished" podID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerID="88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" exitCode=143 Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661334 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerDied","Data":"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee"} Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661359 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerDied","Data":"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12"} Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661368 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerDied","Data":"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8"} Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661378 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"5c99f5b1-8566-4141-9bd4-71a75e7f43b6","Type":"ContainerDied","Data":"632343238b8b6273cfe0d462a1823f7261ef1f48b55a453cfe7a4028e8a3bc11"} Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.661608 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.684633 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.698648 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.704842 4751 scope.go:117] "RemoveContainer" containerID="e332c13695ee9418872977980d66d846c226e037834cc39d0c88b742a39fc6a9" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.717208 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.722033 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.722081 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.722927 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.730299 4751 scope.go:117] "RemoveContainer" containerID="cdf19d70be1f20e70e6253f8f6e27452c7be5e6f13392a31b245256551aa31c1" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.753638 4751 scope.go:117] "RemoveContainer" containerID="979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.769812 4751 scope.go:117] "RemoveContainer" containerID="b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" Jan 31 15:02:30 crc 
kubenswrapper[4751]: I0131 15:02:30.789727 4751 scope.go:117] "RemoveContainer" containerID="88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.813750 4751 scope.go:117] "RemoveContainer" containerID="979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" Jan 31 15:02:30 crc kubenswrapper[4751]: E0131 15:02:30.814206 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": container with ID starting with 979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee not found: ID does not exist" containerID="979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.814229 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee"} err="failed to get container status \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": rpc error: code = NotFound desc = could not find container \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": container with ID starting with 979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.814251 4751 scope.go:117] "RemoveContainer" containerID="b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" Jan 31 15:02:30 crc kubenswrapper[4751]: E0131 15:02:30.814701 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": container with ID starting with b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12 not found: ID does not exist" 
containerID="b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.814722 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12"} err="failed to get container status \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": rpc error: code = NotFound desc = could not find container \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": container with ID starting with b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12 not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.814737 4751 scope.go:117] "RemoveContainer" containerID="88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" Jan 31 15:02:30 crc kubenswrapper[4751]: E0131 15:02:30.814999 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": container with ID starting with 88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8 not found: ID does not exist" containerID="88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815019 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8"} err="failed to get container status \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": rpc error: code = NotFound desc = could not find container \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": container with ID starting with 88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8 not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815033 4751 scope.go:117] 
"RemoveContainer" containerID="979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815356 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee"} err="failed to get container status \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": rpc error: code = NotFound desc = could not find container \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": container with ID starting with 979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815408 4751 scope.go:117] "RemoveContainer" containerID="b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815674 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12"} err="failed to get container status \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": rpc error: code = NotFound desc = could not find container \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": container with ID starting with b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12 not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815693 4751 scope.go:117] "RemoveContainer" containerID="88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815884 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8"} err="failed to get container status \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": rpc error: code = 
NotFound desc = could not find container \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": container with ID starting with 88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8 not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.815902 4751 scope.go:117] "RemoveContainer" containerID="979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.816186 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee"} err="failed to get container status \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": rpc error: code = NotFound desc = could not find container \"979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee\": container with ID starting with 979026a33bcf12c3258686b10538e86fd757dcafb8921f3a74303169ea20efee not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.816205 4751 scope.go:117] "RemoveContainer" containerID="b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.816529 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12"} err="failed to get container status \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": rpc error: code = NotFound desc = could not find container \"b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12\": container with ID starting with b116da01d3251d46a5037ade80e62e2ba05b68577c23f5fe7ea7d5c1c6939e12 not found: ID does not exist" Jan 31 15:02:30 crc kubenswrapper[4751]: I0131 15:02:30.816548 4751 scope.go:117] "RemoveContainer" containerID="88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8" Jan 31 15:02:30 crc 
kubenswrapper[4751]: I0131 15:02:30.816754 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8"} err="failed to get container status \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": rpc error: code = NotFound desc = could not find container \"88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8\": container with ID starting with 88dad209ae8e8d063a908076aa5adea9728c00caf34294e807966d88047f26f8 not found: ID does not exist" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.059738 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mxvm7"] Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.065445 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mxvm7"] Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.099596 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancea977-account-delete-zhttj"] Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.099920 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.099940 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.099965 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.099972 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.099985 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.099992 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100009 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100014 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100025 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100032 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100040 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100049 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100080 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100088 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100099 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" 
containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100105 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100117 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100124 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100136 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100144 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100159 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100166 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100179 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100186 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100197 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-httpd" Jan 31 15:02:32 crc 
kubenswrapper[4751]: I0131 15:02:32.100206 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100223 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100231 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100243 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100250 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100263 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100270 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100281 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100288 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: E0131 15:02:32.100297 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100302 4751 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100422 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100432 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100441 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100448 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100455 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100462 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100472 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100480 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100490 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100499 4751 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-log" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100508 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad25a0a-80c0-46fc-9eb7-c91e86c2d3ad" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100517 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="95acd323-0a11-4e25-8439-f848c8811df5" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100526 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100533 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100539 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-httpd" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100546 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="320d0141-d27c-4f4d-9527-ae0f4db2f4fe" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100554 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a459e47-85a7-4f4d-84ba-a7d3e01180dc" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.100561 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" containerName="glance-api" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.101134 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.111289 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea977-account-delete-zhttj"] Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.244799 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h45f\" (UniqueName: \"kubernetes.io/projected/67f83dc8-ae5c-44bf-8760-91952693b0cb-kube-api-access-8h45f\") pod \"glancea977-account-delete-zhttj\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.244883 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f83dc8-ae5c-44bf-8760-91952693b0cb-operator-scripts\") pod \"glancea977-account-delete-zhttj\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.346394 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h45f\" (UniqueName: \"kubernetes.io/projected/67f83dc8-ae5c-44bf-8760-91952693b0cb-kube-api-access-8h45f\") pod \"glancea977-account-delete-zhttj\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.346481 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f83dc8-ae5c-44bf-8760-91952693b0cb-operator-scripts\") pod \"glancea977-account-delete-zhttj\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 
15:02:32.347225 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f83dc8-ae5c-44bf-8760-91952693b0cb-operator-scripts\") pod \"glancea977-account-delete-zhttj\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.369043 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h45f\" (UniqueName: \"kubernetes.io/projected/67f83dc8-ae5c-44bf-8760-91952693b0cb-kube-api-access-8h45f\") pod \"glancea977-account-delete-zhttj\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.415797 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.415935 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c99f5b1-8566-4141-9bd4-71a75e7f43b6" path="/var/lib/kubelet/pods/5c99f5b1-8566-4141-9bd4-71a75e7f43b6/volumes" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.416831 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf741e4-9445-4080-84f2-601e270f7aa0" path="/var/lib/kubelet/pods/dbf741e4-9445-4080-84f2-601e270f7aa0/volumes" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.417815 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b77b88-19a5-4bdc-87a1-6a65273226a2" path="/var/lib/kubelet/pods/f0b77b88-19a5-4bdc-87a1-6a65273226a2/volumes" Jan 31 15:02:32 crc kubenswrapper[4751]: I0131 15:02:32.859709 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea977-account-delete-zhttj"] Jan 31 15:02:33 crc kubenswrapper[4751]: I0131 15:02:33.690474 4751 generic.go:334] "Generic (PLEG): container 
finished" podID="67f83dc8-ae5c-44bf-8760-91952693b0cb" containerID="748bb24fed6fe40319dbeeaf8bdfc4e48c0cf8e80d0e06626f9b2a7dd29a8843" exitCode=0 Jan 31 15:02:33 crc kubenswrapper[4751]: I0131 15:02:33.690576 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" event={"ID":"67f83dc8-ae5c-44bf-8760-91952693b0cb","Type":"ContainerDied","Data":"748bb24fed6fe40319dbeeaf8bdfc4e48c0cf8e80d0e06626f9b2a7dd29a8843"} Jan 31 15:02:33 crc kubenswrapper[4751]: I0131 15:02:33.690807 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" event={"ID":"67f83dc8-ae5c-44bf-8760-91952693b0cb","Type":"ContainerStarted","Data":"98436c2d7cbd664ed0c8b67784e8453bdc23efe3519e54a49c94c675f566dd23"} Jan 31 15:02:34 crc kubenswrapper[4751]: I0131 15:02:34.971876 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.082658 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f83dc8-ae5c-44bf-8760-91952693b0cb-operator-scripts\") pod \"67f83dc8-ae5c-44bf-8760-91952693b0cb\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.082727 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h45f\" (UniqueName: \"kubernetes.io/projected/67f83dc8-ae5c-44bf-8760-91952693b0cb-kube-api-access-8h45f\") pod \"67f83dc8-ae5c-44bf-8760-91952693b0cb\" (UID: \"67f83dc8-ae5c-44bf-8760-91952693b0cb\") " Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.083739 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f83dc8-ae5c-44bf-8760-91952693b0cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"67f83dc8-ae5c-44bf-8760-91952693b0cb" (UID: "67f83dc8-ae5c-44bf-8760-91952693b0cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.089083 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f83dc8-ae5c-44bf-8760-91952693b0cb-kube-api-access-8h45f" (OuterVolumeSpecName: "kube-api-access-8h45f") pod "67f83dc8-ae5c-44bf-8760-91952693b0cb" (UID: "67f83dc8-ae5c-44bf-8760-91952693b0cb"). InnerVolumeSpecName "kube-api-access-8h45f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.184134 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f83dc8-ae5c-44bf-8760-91952693b0cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.184167 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h45f\" (UniqueName: \"kubernetes.io/projected/67f83dc8-ae5c-44bf-8760-91952693b0cb-kube-api-access-8h45f\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.711230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" event={"ID":"67f83dc8-ae5c-44bf-8760-91952693b0cb","Type":"ContainerDied","Data":"98436c2d7cbd664ed0c8b67784e8453bdc23efe3519e54a49c94c675f566dd23"} Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.711602 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98436c2d7cbd664ed0c8b67784e8453bdc23efe3519e54a49c94c675f566dd23" Jan 31 15:02:35 crc kubenswrapper[4751]: I0131 15:02:35.711279 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea977-account-delete-zhttj" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.125671 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-mcgm2"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.131583 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-mcgm2"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.140381 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancea977-account-delete-zhttj"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.145700 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-a977-account-create-update-tlstz"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.150277 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-a977-account-create-update-tlstz"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.155012 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancea977-account-delete-zhttj"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.746488 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-b924d"] Jan 31 15:02:37 crc kubenswrapper[4751]: E0131 15:02:37.747341 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f83dc8-ae5c-44bf-8760-91952693b0cb" containerName="mariadb-account-delete" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.747360 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f83dc8-ae5c-44bf-8760-91952693b0cb" containerName="mariadb-account-delete" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.748994 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f83dc8-ae5c-44bf-8760-91952693b0cb" containerName="mariadb-account-delete" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.751446 4751 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.758822 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-5797-account-create-update-hfdl2"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.759903 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.765724 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5797-account-create-update-hfdl2"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.770604 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.773200 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-b924d"] Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.823561 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frh95\" (UniqueName: \"kubernetes.io/projected/58c33299-57ac-4fc9-9751-b521d31e60cf-kube-api-access-frh95\") pod \"glance-db-create-b924d\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.823701 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58c33299-57ac-4fc9-9751-b521d31e60cf-operator-scripts\") pod \"glance-db-create-b924d\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.924415 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58c33299-57ac-4fc9-9751-b521d31e60cf-operator-scripts\") pod \"glance-db-create-b924d\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.924479 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896f2e37-3440-46e7-81ed-2805ab336470-operator-scripts\") pod \"glance-5797-account-create-update-hfdl2\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.924504 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frh95\" (UniqueName: \"kubernetes.io/projected/58c33299-57ac-4fc9-9751-b521d31e60cf-kube-api-access-frh95\") pod \"glance-db-create-b924d\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.924522 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hsvw\" (UniqueName: \"kubernetes.io/projected/896f2e37-3440-46e7-81ed-2805ab336470-kube-api-access-8hsvw\") pod \"glance-5797-account-create-update-hfdl2\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.925166 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58c33299-57ac-4fc9-9751-b521d31e60cf-operator-scripts\") pod \"glance-db-create-b924d\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:37 crc kubenswrapper[4751]: I0131 15:02:37.952089 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frh95\" (UniqueName: \"kubernetes.io/projected/58c33299-57ac-4fc9-9751-b521d31e60cf-kube-api-access-frh95\") pod \"glance-db-create-b924d\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.025662 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hsvw\" (UniqueName: \"kubernetes.io/projected/896f2e37-3440-46e7-81ed-2805ab336470-kube-api-access-8hsvw\") pod \"glance-5797-account-create-update-hfdl2\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.025833 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896f2e37-3440-46e7-81ed-2805ab336470-operator-scripts\") pod \"glance-5797-account-create-update-hfdl2\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.026657 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896f2e37-3440-46e7-81ed-2805ab336470-operator-scripts\") pod \"glance-5797-account-create-update-hfdl2\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.042912 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hsvw\" (UniqueName: \"kubernetes.io/projected/896f2e37-3440-46e7-81ed-2805ab336470-kube-api-access-8hsvw\") pod \"glance-5797-account-create-update-hfdl2\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " 
pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.077006 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.085191 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.414447 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f83dc8-ae5c-44bf-8760-91952693b0cb" path="/var/lib/kubelet/pods/67f83dc8-ae5c-44bf-8760-91952693b0cb/volumes" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.415383 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e826f0-62a4-4a7c-8945-0c29cd34e667" path="/var/lib/kubelet/pods/d9e826f0-62a4-4a7c-8945-0c29cd34e667/volumes" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.415839 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9730563-64d8-44a2-9d93-7fe5fcd4c8d4" path="/var/lib/kubelet/pods/e9730563-64d8-44a2-9d93-7fe5fcd4c8d4/volumes" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.564755 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5797-account-create-update-hfdl2"] Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.612792 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-b924d"] Jan 31 15:02:38 crc kubenswrapper[4751]: W0131 15:02:38.620434 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58c33299_57ac_4fc9_9751_b521d31e60cf.slice/crio-c586edc1f2bb3b69231b90966abffedc546d02a97477bc07a17109fd14a87d7e WatchSource:0}: Error finding container c586edc1f2bb3b69231b90966abffedc546d02a97477bc07a17109fd14a87d7e: Status 404 returned error 
can't find the container with id c586edc1f2bb3b69231b90966abffedc546d02a97477bc07a17109fd14a87d7e Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.746514 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" event={"ID":"896f2e37-3440-46e7-81ed-2805ab336470","Type":"ContainerStarted","Data":"79a10f8ac34beb7889999938f10a1f8fd98e243cac870f30f1dc184e88a0e786"} Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.747010 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" event={"ID":"896f2e37-3440-46e7-81ed-2805ab336470","Type":"ContainerStarted","Data":"7ec3d74c87884c401f6544a44bd9fa3d26f9394122c1ea5e7d900c5c386bfef1"} Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.748524 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b924d" event={"ID":"58c33299-57ac-4fc9-9751-b521d31e60cf","Type":"ContainerStarted","Data":"f05a5057693bfdfb7d9c10870add4a18c1b97e05d99f428268b3b93785058feb"} Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.748563 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b924d" event={"ID":"58c33299-57ac-4fc9-9751-b521d31e60cf","Type":"ContainerStarted","Data":"c586edc1f2bb3b69231b90966abffedc546d02a97477bc07a17109fd14a87d7e"} Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.766311 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" podStartSLOduration=1.7662908800000001 podStartE2EDuration="1.76629088s" podCreationTimestamp="2026-01-31 15:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:38.759594553 +0000 UTC m=+1263.134307438" watchObservedRunningTime="2026-01-31 15:02:38.76629088 +0000 UTC m=+1263.141003755" 
Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.779370 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-b924d" podStartSLOduration=1.7793534439999998 podStartE2EDuration="1.779353444s" podCreationTimestamp="2026-01-31 15:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:38.774896317 +0000 UTC m=+1263.149609222" watchObservedRunningTime="2026-01-31 15:02:38.779353444 +0000 UTC m=+1263.154066329" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.896431 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.896504 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.896560 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 15:02:38 crc kubenswrapper[4751]: I0131 15:02:38.897286 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fcf941f127d31d0e5c99d5ef038c633782d289d0e911f4e9c5c6f77b2a91e2a"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 15:02:38 crc 
kubenswrapper[4751]: I0131 15:02:38.897361 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://7fcf941f127d31d0e5c99d5ef038c633782d289d0e911f4e9c5c6f77b2a91e2a" gracePeriod=600 Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.760635 4751 generic.go:334] "Generic (PLEG): container finished" podID="58c33299-57ac-4fc9-9751-b521d31e60cf" containerID="f05a5057693bfdfb7d9c10870add4a18c1b97e05d99f428268b3b93785058feb" exitCode=0 Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.760724 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b924d" event={"ID":"58c33299-57ac-4fc9-9751-b521d31e60cf","Type":"ContainerDied","Data":"f05a5057693bfdfb7d9c10870add4a18c1b97e05d99f428268b3b93785058feb"} Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.763337 4751 generic.go:334] "Generic (PLEG): container finished" podID="896f2e37-3440-46e7-81ed-2805ab336470" containerID="79a10f8ac34beb7889999938f10a1f8fd98e243cac870f30f1dc184e88a0e786" exitCode=0 Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.763384 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" event={"ID":"896f2e37-3440-46e7-81ed-2805ab336470","Type":"ContainerDied","Data":"79a10f8ac34beb7889999938f10a1f8fd98e243cac870f30f1dc184e88a0e786"} Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.767211 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="7fcf941f127d31d0e5c99d5ef038c633782d289d0e911f4e9c5c6f77b2a91e2a" exitCode=0 Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.767280 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" 
event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"7fcf941f127d31d0e5c99d5ef038c633782d289d0e911f4e9c5c6f77b2a91e2a"} Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.767321 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"89a88ddaeae8a6fe7859be79e45bc66e157a0d02a03f5daf69e0ab6320bd15be"} Jan 31 15:02:39 crc kubenswrapper[4751]: I0131 15:02:39.767349 4751 scope.go:117] "RemoveContainer" containerID="dc064826cd8a78005216541d25736856cc2dd920bfe44778b79dbfd2f76ed341" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.095582 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.102281 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.171260 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896f2e37-3440-46e7-81ed-2805ab336470-operator-scripts\") pod \"896f2e37-3440-46e7-81ed-2805ab336470\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.171416 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hsvw\" (UniqueName: \"kubernetes.io/projected/896f2e37-3440-46e7-81ed-2805ab336470-kube-api-access-8hsvw\") pod \"896f2e37-3440-46e7-81ed-2805ab336470\" (UID: \"896f2e37-3440-46e7-81ed-2805ab336470\") " Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.172172 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/896f2e37-3440-46e7-81ed-2805ab336470-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "896f2e37-3440-46e7-81ed-2805ab336470" (UID: "896f2e37-3440-46e7-81ed-2805ab336470"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.177778 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896f2e37-3440-46e7-81ed-2805ab336470-kube-api-access-8hsvw" (OuterVolumeSpecName: "kube-api-access-8hsvw") pod "896f2e37-3440-46e7-81ed-2805ab336470" (UID: "896f2e37-3440-46e7-81ed-2805ab336470"). InnerVolumeSpecName "kube-api-access-8hsvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.272655 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58c33299-57ac-4fc9-9751-b521d31e60cf-operator-scripts\") pod \"58c33299-57ac-4fc9-9751-b521d31e60cf\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.272744 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frh95\" (UniqueName: \"kubernetes.io/projected/58c33299-57ac-4fc9-9751-b521d31e60cf-kube-api-access-frh95\") pod \"58c33299-57ac-4fc9-9751-b521d31e60cf\" (UID: \"58c33299-57ac-4fc9-9751-b521d31e60cf\") " Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.273024 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/896f2e37-3440-46e7-81ed-2805ab336470-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.273037 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hsvw\" (UniqueName: 
\"kubernetes.io/projected/896f2e37-3440-46e7-81ed-2805ab336470-kube-api-access-8hsvw\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.273473 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c33299-57ac-4fc9-9751-b521d31e60cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58c33299-57ac-4fc9-9751-b521d31e60cf" (UID: "58c33299-57ac-4fc9-9751-b521d31e60cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.279582 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c33299-57ac-4fc9-9751-b521d31e60cf-kube-api-access-frh95" (OuterVolumeSpecName: "kube-api-access-frh95") pod "58c33299-57ac-4fc9-9751-b521d31e60cf" (UID: "58c33299-57ac-4fc9-9751-b521d31e60cf"). InnerVolumeSpecName "kube-api-access-frh95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.374410 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58c33299-57ac-4fc9-9751-b521d31e60cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.374461 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frh95\" (UniqueName: \"kubernetes.io/projected/58c33299-57ac-4fc9-9751-b521d31e60cf-kube-api-access-frh95\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.784405 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-b924d" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.784589 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-b924d" event={"ID":"58c33299-57ac-4fc9-9751-b521d31e60cf","Type":"ContainerDied","Data":"c586edc1f2bb3b69231b90966abffedc546d02a97477bc07a17109fd14a87d7e"} Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.784675 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c586edc1f2bb3b69231b90966abffedc546d02a97477bc07a17109fd14a87d7e" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.785985 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" event={"ID":"896f2e37-3440-46e7-81ed-2805ab336470","Type":"ContainerDied","Data":"7ec3d74c87884c401f6544a44bd9fa3d26f9394122c1ea5e7d900c5c386bfef1"} Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.786038 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ec3d74c87884c401f6544a44bd9fa3d26f9394122c1ea5e7d900c5c386bfef1" Jan 31 15:02:41 crc kubenswrapper[4751]: I0131 15:02:41.786006 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5797-account-create-update-hfdl2" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.854560 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-pn752"] Jan 31 15:02:42 crc kubenswrapper[4751]: E0131 15:02:42.855101 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896f2e37-3440-46e7-81ed-2805ab336470" containerName="mariadb-account-create-update" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.855116 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="896f2e37-3440-46e7-81ed-2805ab336470" containerName="mariadb-account-create-update" Jan 31 15:02:42 crc kubenswrapper[4751]: E0131 15:02:42.855148 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c33299-57ac-4fc9-9751-b521d31e60cf" containerName="mariadb-database-create" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.855156 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c33299-57ac-4fc9-9751-b521d31e60cf" containerName="mariadb-database-create" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.855304 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c33299-57ac-4fc9-9751-b521d31e60cf" containerName="mariadb-database-create" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.855326 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="896f2e37-3440-46e7-81ed-2805ab336470" containerName="mariadb-account-create-update" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.855721 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.858896 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.859201 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-hltqc" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.867181 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-pn752"] Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.997970 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2hb\" (UniqueName: \"kubernetes.io/projected/d350f693-ea74-48d5-a7a7-3fa3264174ca-kube-api-access-8g2hb\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.998061 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-db-sync-config-data\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:42 crc kubenswrapper[4751]: I0131 15:02:42.998124 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-config-data\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.099521 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-config-data\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.099635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2hb\" (UniqueName: \"kubernetes.io/projected/d350f693-ea74-48d5-a7a7-3fa3264174ca-kube-api-access-8g2hb\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.099713 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-db-sync-config-data\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.111982 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-db-sync-config-data\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.112013 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-config-data\") pod \"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.117723 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2hb\" (UniqueName: \"kubernetes.io/projected/d350f693-ea74-48d5-a7a7-3fa3264174ca-kube-api-access-8g2hb\") pod 
\"glance-db-sync-pn752\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.171591 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.582003 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-pn752"] Jan 31 15:02:43 crc kubenswrapper[4751]: W0131 15:02:43.585713 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd350f693_ea74_48d5_a7a7_3fa3264174ca.slice/crio-6009e8294d091b1eb1aa19dd6fb100b42cfbbfaf924c7ebad1269a56b8969294 WatchSource:0}: Error finding container 6009e8294d091b1eb1aa19dd6fb100b42cfbbfaf924c7ebad1269a56b8969294: Status 404 returned error can't find the container with id 6009e8294d091b1eb1aa19dd6fb100b42cfbbfaf924c7ebad1269a56b8969294 Jan 31 15:02:43 crc kubenswrapper[4751]: I0131 15:02:43.801523 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-pn752" event={"ID":"d350f693-ea74-48d5-a7a7-3fa3264174ca","Type":"ContainerStarted","Data":"6009e8294d091b1eb1aa19dd6fb100b42cfbbfaf924c7ebad1269a56b8969294"} Jan 31 15:02:44 crc kubenswrapper[4751]: I0131 15:02:44.810024 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-pn752" event={"ID":"d350f693-ea74-48d5-a7a7-3fa3264174ca","Type":"ContainerStarted","Data":"0e1d80ca3a8421336cb1b11f5bd0a2d183f47c5e60dedbf720f6c08836e3d291"} Jan 31 15:02:44 crc kubenswrapper[4751]: I0131 15:02:44.841810 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-pn752" podStartSLOduration=2.841791062 podStartE2EDuration="2.841791062s" podCreationTimestamp="2026-01-31 15:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:44.839773838 +0000 UTC m=+1269.214486723" watchObservedRunningTime="2026-01-31 15:02:44.841791062 +0000 UTC m=+1269.216503967" Jan 31 15:02:46 crc kubenswrapper[4751]: I0131 15:02:46.825331 4751 generic.go:334] "Generic (PLEG): container finished" podID="d350f693-ea74-48d5-a7a7-3fa3264174ca" containerID="0e1d80ca3a8421336cb1b11f5bd0a2d183f47c5e60dedbf720f6c08836e3d291" exitCode=0 Jan 31 15:02:46 crc kubenswrapper[4751]: I0131 15:02:46.825437 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-pn752" event={"ID":"d350f693-ea74-48d5-a7a7-3fa3264174ca","Type":"ContainerDied","Data":"0e1d80ca3a8421336cb1b11f5bd0a2d183f47c5e60dedbf720f6c08836e3d291"} Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.158274 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.271008 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g2hb\" (UniqueName: \"kubernetes.io/projected/d350f693-ea74-48d5-a7a7-3fa3264174ca-kube-api-access-8g2hb\") pod \"d350f693-ea74-48d5-a7a7-3fa3264174ca\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.271255 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-db-sync-config-data\") pod \"d350f693-ea74-48d5-a7a7-3fa3264174ca\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.271282 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-config-data\") pod 
\"d350f693-ea74-48d5-a7a7-3fa3264174ca\" (UID: \"d350f693-ea74-48d5-a7a7-3fa3264174ca\") " Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.276892 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d350f693-ea74-48d5-a7a7-3fa3264174ca" (UID: "d350f693-ea74-48d5-a7a7-3fa3264174ca"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.278878 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d350f693-ea74-48d5-a7a7-3fa3264174ca-kube-api-access-8g2hb" (OuterVolumeSpecName: "kube-api-access-8g2hb") pod "d350f693-ea74-48d5-a7a7-3fa3264174ca" (UID: "d350f693-ea74-48d5-a7a7-3fa3264174ca"). InnerVolumeSpecName "kube-api-access-8g2hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.309213 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-config-data" (OuterVolumeSpecName: "config-data") pod "d350f693-ea74-48d5-a7a7-3fa3264174ca" (UID: "d350f693-ea74-48d5-a7a7-3fa3264174ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.373479 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.373515 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d350f693-ea74-48d5-a7a7-3fa3264174ca-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.373527 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g2hb\" (UniqueName: \"kubernetes.io/projected/d350f693-ea74-48d5-a7a7-3fa3264174ca-kube-api-access-8g2hb\") on node \"crc\" DevicePath \"\"" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.839624 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-pn752" event={"ID":"d350f693-ea74-48d5-a7a7-3fa3264174ca","Type":"ContainerDied","Data":"6009e8294d091b1eb1aa19dd6fb100b42cfbbfaf924c7ebad1269a56b8969294"} Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.839662 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6009e8294d091b1eb1aa19dd6fb100b42cfbbfaf924c7ebad1269a56b8969294" Jan 31 15:02:48 crc kubenswrapper[4751]: I0131 15:02:48.839695 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-pn752" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.144856 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:02:50 crc kubenswrapper[4751]: E0131 15:02:50.145544 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d350f693-ea74-48d5-a7a7-3fa3264174ca" containerName="glance-db-sync" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.145562 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d350f693-ea74-48d5-a7a7-3fa3264174ca" containerName="glance-db-sync" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.145705 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d350f693-ea74-48d5-a7a7-3fa3264174ca" containerName="glance-db-sync" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.146408 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.148018 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.148143 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.154114 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-hltqc" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.160009 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297125 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297183 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-logs\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-dev\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297456 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297528 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297555 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-sys\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297580 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297689 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297727 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297787 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297812 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-run\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297878 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcmb\" (UniqueName: \"kubernetes.io/projected/a5d5c53d-eea5-4866-983f-8477eb16177b-kube-api-access-llcmb\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297905 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.297934 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.398974 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 
15:02:50.399021 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399050 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399080 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-run\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399108 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llcmb\" (UniqueName: \"kubernetes.io/projected/a5d5c53d-eea5-4866-983f-8477eb16177b-kube-api-access-llcmb\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399126 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399151 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399182 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399203 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-logs\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399237 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-dev\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399243 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-run\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399257 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399383 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-sys\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399408 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399433 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399435 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-lib-modules\") pod 
\"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399472 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-dev\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399353 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399581 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-sys\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399638 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399789 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") device mount path 
\"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.399972 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.400004 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-logs\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.400101 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.408924 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.409379 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.419705 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcmb\" (UniqueName: \"kubernetes.io/projected/a5d5c53d-eea5-4866-983f-8477eb16177b-kube-api-access-llcmb\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.424151 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.431509 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.463053 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.587872 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.589010 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.591527 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.602637 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.703769 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.703844 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.703869 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.703919 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-run\") pod \"glance-default-internal-api-0\" (UID: 
\"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.703984 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704001 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-logs\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704037 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdrl\" (UniqueName: \"kubernetes.io/projected/00ce2535-6386-444d-8bbd-abded7935ebf-kube-api-access-vwdrl\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704062 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704097 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704125 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704139 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-dev\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704164 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704201 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.704227 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-sys\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805317 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805375 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805410 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-run\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805447 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805465 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805488 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805514 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdrl\" (UniqueName: \"kubernetes.io/projected/00ce2535-6386-444d-8bbd-abded7935ebf-kube-api-access-vwdrl\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805502 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-run\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805591 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805641 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805722 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805751 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-dev\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805818 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805837 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-dev\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805854 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") device mount path \"/mnt/openstack/pv13\"" 
pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805843 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805850 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805879 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805962 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-sys\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.806002 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.806013 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-sys\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.805973 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.806032 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.806114 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-logs\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.806154 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.812777 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.813246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.822062 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdrl\" (UniqueName: \"kubernetes.io/projected/00ce2535-6386-444d-8bbd-abded7935ebf-kube-api-access-vwdrl\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.829347 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.832592 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.907054 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:50 crc kubenswrapper[4751]: I0131 15:02:50.909045 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.117274 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.339859 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.879806 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"a5d5c53d-eea5-4866-983f-8477eb16177b","Type":"ContainerStarted","Data":"46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9"}
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.880330 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"a5d5c53d-eea5-4866-983f-8477eb16177b","Type":"ContainerStarted","Data":"1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d"}
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.880347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"a5d5c53d-eea5-4866-983f-8477eb16177b","Type":"ContainerStarted","Data":"f196d65739ae0a450d1f988eb2e240599d07c3bc3006a4085be84398709e7ee3"}
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.882978 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"00ce2535-6386-444d-8bbd-abded7935ebf","Type":"ContainerStarted","Data":"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"}
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.883003 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"00ce2535-6386-444d-8bbd-abded7935ebf","Type":"ContainerStarted","Data":"6f4bddbaf0148a369cf2bebe12727f16d7417ca9251868a8567b9cbe7ad7cc1d"}
Jan 31 15:02:51 crc kubenswrapper[4751]: I0131 15:02:51.910543 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=1.910519046 podStartE2EDuration="1.910519046s" podCreationTimestamp="2026-01-31 15:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:51.901704943 +0000 UTC m=+1276.276417828" watchObservedRunningTime="2026-01-31 15:02:51.910519046 +0000 UTC m=+1276.285231951"
Jan 31 15:02:52 crc kubenswrapper[4751]: I0131 15:02:52.891830 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"00ce2535-6386-444d-8bbd-abded7935ebf","Type":"ContainerStarted","Data":"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"}
Jan 31 15:02:52 crc kubenswrapper[4751]: I0131 15:02:52.891882 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-log" containerID="cri-o://ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b" gracePeriod=30
Jan 31 15:02:52 crc kubenswrapper[4751]: I0131 15:02:52.891994 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-httpd" containerID="cri-o://8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1" gracePeriod=30
Jan 31 15:02:52 crc kubenswrapper[4751]: I0131 15:02:52.926356 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.926336553 podStartE2EDuration="3.926336553s" podCreationTimestamp="2026-01-31 15:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:52.920374176 +0000 UTC m=+1277.295087081" watchObservedRunningTime="2026-01-31 15:02:52.926336553 +0000 UTC m=+1277.301049438"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.313510 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.351984 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-var-locks-brick\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352044 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-scripts\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352119 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-sys\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352145 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-dev\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352176 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352193 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-nvme\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352223 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-lib-modules\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352243 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352260 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwdrl\" (UniqueName: \"kubernetes.io/projected/00ce2535-6386-444d-8bbd-abded7935ebf-kube-api-access-vwdrl\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352295 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-iscsi\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352335 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-httpd-run\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352397 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-config-data\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352424 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-logs\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352438 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-run\") pod \"00ce2535-6386-444d-8bbd-abded7935ebf\" (UID: \"00ce2535-6386-444d-8bbd-abded7935ebf\") "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352721 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-run" (OuterVolumeSpecName: "run") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.352751 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.357796 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-sys" (OuterVolumeSpecName: "sys") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.357849 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-dev" (OuterVolumeSpecName: "dev") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.359234 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.359335 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.359478 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.359591 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-scripts" (OuterVolumeSpecName: "scripts") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.359826 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.359925 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-logs" (OuterVolumeSpecName: "logs") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.361405 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.362264 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.364143 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ce2535-6386-444d-8bbd-abded7935ebf-kube-api-access-vwdrl" (OuterVolumeSpecName: "kube-api-access-vwdrl") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "kube-api-access-vwdrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.400631 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-config-data" (OuterVolumeSpecName: "config-data") pod "00ce2535-6386-444d-8bbd-abded7935ebf" (UID: "00ce2535-6386-444d-8bbd-abded7935ebf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454659 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454691 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454700 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454708 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454726 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454737 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454745 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-lib-modules\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454757 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" "
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454767 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwdrl\" (UniqueName: \"kubernetes.io/projected/00ce2535-6386-444d-8bbd-abded7935ebf-kube-api-access-vwdrl\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454776 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454784 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454792 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce2535-6386-444d-8bbd-abded7935ebf-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454801 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ce2535-6386-444d-8bbd-abded7935ebf-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.454809 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ce2535-6386-444d-8bbd-abded7935ebf-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.471259 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.471363 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.556094 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.556128 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.907030 4751 generic.go:334] "Generic (PLEG): container finished" podID="00ce2535-6386-444d-8bbd-abded7935ebf" containerID="8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1" exitCode=143
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.908317 4751 generic.go:334] "Generic (PLEG): container finished" podID="00ce2535-6386-444d-8bbd-abded7935ebf" containerID="ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b" exitCode=143
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.907223 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.907126 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"00ce2535-6386-444d-8bbd-abded7935ebf","Type":"ContainerDied","Data":"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"}
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.908642 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"00ce2535-6386-444d-8bbd-abded7935ebf","Type":"ContainerDied","Data":"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"}
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.908669 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"00ce2535-6386-444d-8bbd-abded7935ebf","Type":"ContainerDied","Data":"6f4bddbaf0148a369cf2bebe12727f16d7417ca9251868a8567b9cbe7ad7cc1d"}
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.908705 4751 scope.go:117] "RemoveContainer" containerID="8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.938582 4751 scope.go:117] "RemoveContainer" containerID="ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.965175 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.975485 4751 scope.go:117] "RemoveContainer" containerID="8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"
Jan 31 15:02:53 crc kubenswrapper[4751]: E0131 15:02:53.975987 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1\": container with ID starting with 8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1 not found: ID does not exist" containerID="8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.976057 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"} err="failed to get container status \"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1\": rpc error: code = NotFound desc = could not find container \"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1\": container with ID starting with 8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1 not found: ID does not exist"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.976107 4751 scope.go:117] "RemoveContainer" containerID="ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"
Jan 31 15:02:53 crc kubenswrapper[4751]: E0131 15:02:53.976459 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b\": container with ID starting with ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b not found: ID does not exist" containerID="ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.976498 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"} err="failed to get container status \"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b\": rpc error: code = NotFound desc = could not find container \"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b\": container with ID starting with ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b not found: ID does not exist"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.976524 4751 scope.go:117] "RemoveContainer" containerID="8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.976795 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1"} err="failed to get container status \"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1\": rpc error: code = NotFound desc = could not find container \"8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1\": container with ID starting with 8320b4bf6cf3688e90f70ba1e9d0543cf0c4236c9baf7a9f5afd1fdd87cde7f1 not found: ID does not exist"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.976816 4751 scope.go:117] "RemoveContainer" containerID="ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.977060 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:02:53 crc kubenswrapper[4751]: I0131 15:02:53.977253 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b"} err="failed to get container status \"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b\": rpc error: code = NotFound desc = could not find container \"ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b\": container with ID starting with ae2fb2bf76086e3824ffca58eb1d7c7c332b7d443fb93e1343b97051d794f81b not found: ID does not exist"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.003688 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:02:54 crc kubenswrapper[4751]: E0131 15:02:54.003978 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-httpd"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.003992 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-httpd"
Jan 31 15:02:54 crc kubenswrapper[4751]: E0131 15:02:54.004028 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-log"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.004038 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-log"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.004291 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-log"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.004333 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" containerName="glance-httpd"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.005358 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.008730 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.025443 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.066748 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxgsq\" (UniqueName: \"kubernetes.io/projected/1c3cde72-72a2-4a51-a061-06397061de3c-kube-api-access-kxgsq\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067140 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067173 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067363 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067420 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067481 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-run\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067520 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-dev\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067622 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-sys\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.067655 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.068518 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.068622 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.068738 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.068812 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName:
\"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170342 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170418 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170460 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170495 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170561 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxgsq\" (UniqueName: 
\"kubernetes.io/projected/1c3cde72-72a2-4a51-a061-06397061de3c-kube-api-access-kxgsq\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170588 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170608 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170629 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170649 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170674 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170703 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-run\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170737 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-dev\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170780 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-sys\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170802 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.170962 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171039 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171106 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171142 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171171 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-run\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171291 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171308 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171349 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-sys\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171343 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-dev\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.171441 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.172354 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-internal-api-0" 
Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.185816 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.187234 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.188874 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxgsq\" (UniqueName: \"kubernetes.io/projected/1c3cde72-72a2-4a51-a061-06397061de3c-kube-api-access-kxgsq\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.195596 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.212371 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.325290 4751 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.429250 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ce2535-6386-444d-8bbd-abded7935ebf" path="/var/lib/kubelet/pods/00ce2535-6386-444d-8bbd-abded7935ebf/volumes" Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.774560 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:02:54 crc kubenswrapper[4751]: W0131 15:02:54.778952 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c3cde72_72a2_4a51_a061_06397061de3c.slice/crio-c898b2da5714126a65ee741ceb3ed33e63dd1016489a4c5e916d0a834f254ea4 WatchSource:0}: Error finding container c898b2da5714126a65ee741ceb3ed33e63dd1016489a4c5e916d0a834f254ea4: Status 404 returned error can't find the container with id c898b2da5714126a65ee741ceb3ed33e63dd1016489a4c5e916d0a834f254ea4 Jan 31 15:02:54 crc kubenswrapper[4751]: I0131 15:02:54.916240 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c3cde72-72a2-4a51-a061-06397061de3c","Type":"ContainerStarted","Data":"c898b2da5714126a65ee741ceb3ed33e63dd1016489a4c5e916d0a834f254ea4"} Jan 31 15:02:55 crc kubenswrapper[4751]: I0131 15:02:55.924493 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c3cde72-72a2-4a51-a061-06397061de3c","Type":"ContainerStarted","Data":"6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635"} Jan 31 15:02:55 crc kubenswrapper[4751]: I0131 15:02:55.924995 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"1c3cde72-72a2-4a51-a061-06397061de3c","Type":"ContainerStarted","Data":"5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480"} Jan 31 15:02:55 crc kubenswrapper[4751]: I0131 15:02:55.956547 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.95651657 podStartE2EDuration="2.95651657s" podCreationTimestamp="2026-01-31 15:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:02:55.942821799 +0000 UTC m=+1280.317534704" watchObservedRunningTime="2026-01-31 15:02:55.95651657 +0000 UTC m=+1280.331229495" Jan 31 15:03:00 crc kubenswrapper[4751]: I0131 15:03:00.463936 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:00 crc kubenswrapper[4751]: I0131 15:03:00.464494 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:00 crc kubenswrapper[4751]: I0131 15:03:00.491698 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:00 crc kubenswrapper[4751]: I0131 15:03:00.519371 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:00 crc kubenswrapper[4751]: I0131 15:03:00.968349 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:00 crc kubenswrapper[4751]: I0131 15:03:00.968855 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:02 crc kubenswrapper[4751]: I0131 15:03:02.815431 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:02 crc kubenswrapper[4751]: I0131 15:03:02.842379 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:03:04 crc kubenswrapper[4751]: I0131 15:03:04.325862 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:04 crc kubenswrapper[4751]: I0131 15:03:04.326194 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:04 crc kubenswrapper[4751]: I0131 15:03:04.399088 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:04 crc kubenswrapper[4751]: I0131 15:03:04.498283 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:05 crc kubenswrapper[4751]: I0131 15:03:05.001513 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:05 crc kubenswrapper[4751]: I0131 15:03:05.001589 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:06 crc kubenswrapper[4751]: I0131 15:03:06.859757 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:06 crc kubenswrapper[4751]: I0131 15:03:06.922704 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.031667 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:03:09 crc 
kubenswrapper[4751]: I0131 15:03:09.032989 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.043756 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.045996 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.050501 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.078876 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129230 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129280 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-scripts\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129302 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: 
\"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129405 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-dev\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129434 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-config-data\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129454 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-run\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129495 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-var-locks-brick\") pod 
\"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129513 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129666 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129715 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129739 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129834 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129895 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75pr7\" (UniqueName: \"kubernetes.io/projected/2142d4ca-115a-49b7-8f50-ac020fdbc342-kube-api-access-75pr7\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.129958 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130000 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130020 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130040 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-logs\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130060 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-dev\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130090 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130112 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-logs\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130148 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-scripts\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130169 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnkf\" (UniqueName: \"kubernetes.io/projected/40930074-48c4-404d-a55c-bb8a4f581f56-kube-api-access-rnnkf\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130184 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-sys\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130212 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-config-data\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130225 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-run\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.130250 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc 
kubenswrapper[4751]: I0131 15:03:09.130264 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-sys\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.148830 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.150006 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.157717 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.158908 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.167380 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.175175 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231289 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-sys\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231344 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231372 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-config-data\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231392 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231411 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231429 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231428 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-sys\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231445 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231479 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231514 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-run\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231541 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231546 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231566 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-scripts\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231588 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231771 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231815 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231847 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231895 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-dev\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231928 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7jg\" (UniqueName: \"kubernetes.io/projected/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-kube-api-access-7j7jg\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231950 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-config-data\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231968 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-scripts\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.231993 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232012 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232035 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-sys\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232042 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-dev\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232059 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-run\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232109 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232125 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232156 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") device mount path \"/mnt/openstack/pv04\"" 
pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-run\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232300 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232129 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232393 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-scripts\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232414 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" 
Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232445 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-run\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232477 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232502 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232533 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232556 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75pr7\" (UniqueName: \"kubernetes.io/projected/2142d4ca-115a-49b7-8f50-ac020fdbc342-kube-api-access-75pr7\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 
crc kubenswrapper[4751]: I0131 15:03:09.232625 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232662 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232715 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232829 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232846 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232864 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-dev\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232887 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232907 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdj2p\" (UniqueName: \"kubernetes.io/projected/63d398be-aa92-4a00-933b-549a0c4e4ad7-kube-api-access-pdj2p\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232932 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-logs\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232939 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.232946 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233005 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-dev\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233033 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233066 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-logs\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233109 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233135 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-dev\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233169 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233199 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-scripts\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233221 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233239 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-logs\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233254 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnkf\" (UniqueName: \"kubernetes.io/projected/40930074-48c4-404d-a55c-bb8a4f581f56-kube-api-access-rnnkf\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233261 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233300 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233321 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-sys\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233329 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-dev\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233339 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-logs\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233361 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233367 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-sys\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233396 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-config-data\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233420 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-run\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233440 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-config-data\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233485 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233528 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-logs\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233629 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-logs\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233868 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-sys\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233906 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-run\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.233974 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.238388 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-scripts\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.239565 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-config-data\") pod 
\"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.253886 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-scripts\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.260872 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-config-data\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.264849 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.267315 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnkf\" (UniqueName: \"kubernetes.io/projected/40930074-48c4-404d-a55c-bb8a4f581f56-kube-api-access-rnnkf\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.269270 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75pr7\" (UniqueName: \"kubernetes.io/projected/2142d4ca-115a-49b7-8f50-ac020fdbc342-kube-api-access-75pr7\") pod \"glance-default-external-api-1\" (UID: 
\"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.278010 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.278929 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.287922 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-2\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334688 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334753 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-config-data\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 
crc kubenswrapper[4751]: I0131 15:03:09.334773 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334791 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334808 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334823 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334845 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-run\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334862 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334877 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334893 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334909 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334940 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7jg\" (UniqueName: \"kubernetes.io/projected/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-kube-api-access-7j7jg\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334961 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-scripts\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.334988 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335008 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-sys\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335029 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-scripts\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335051 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-run\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335112 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-dev\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335130 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdj2p\" (UniqueName: \"kubernetes.io/projected/63d398be-aa92-4a00-933b-549a0c4e4ad7-kube-api-access-pdj2p\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335147 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335172 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335188 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-dev\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335207 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335228 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-logs\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335244 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335258 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-sys\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335273 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-logs\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335293 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-config-data\") 
pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335433 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335921 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-run\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335948 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.335975 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.336155 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.336128 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.337499 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.337543 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-dev\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.337577 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-run\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.337650 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-sys\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc 
kubenswrapper[4751]: I0131 15:03:09.338709 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.338812 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339005 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-config-data\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339087 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-dev\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339132 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339394 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-logs\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339441 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339470 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-sys\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339522 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339496 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.336057 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339657 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.339782 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-logs\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.343309 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-scripts\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.343576 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-scripts\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.349911 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-config-data\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.351403 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.354486 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdj2p\" (UniqueName: \"kubernetes.io/projected/63d398be-aa92-4a00-933b-549a0c4e4ad7-kube-api-access-pdj2p\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.359258 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7jg\" (UniqueName: \"kubernetes.io/projected/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-kube-api-access-7j7jg\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.359848 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.364011 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc 
kubenswrapper[4751]: I0131 15:03:09.370374 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.374280 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-2\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.383903 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-1\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.477048 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.487698 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.777765 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:03:09 crc kubenswrapper[4751]: W0131 15:03:09.779048 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40930074_48c4_404d_a55c_bb8a4f581f56.slice/crio-95584e8f76d354aa1bb1539546392e64aa9a3d998c839b11c54c3e4e4b46195b WatchSource:0}: Error finding container 95584e8f76d354aa1bb1539546392e64aa9a3d998c839b11c54c3e4e4b46195b: Status 404 returned error can't find the container with id 95584e8f76d354aa1bb1539546392e64aa9a3d998c839b11c54c3e4e4b46195b Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.851918 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:03:09 crc kubenswrapper[4751]: W0131 15:03:09.918496 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3d6d7db_fc12_479e_aedf_8ef829bf01e5.slice/crio-cc8b1dd3b31a488ef9b0862ddc7dae65875f8b41c155c15581f698c10d6ef4dd WatchSource:0}: Error finding container cc8b1dd3b31a488ef9b0862ddc7dae65875f8b41c155c15581f698c10d6ef4dd: Status 404 returned error can't find the container with id cc8b1dd3b31a488ef9b0862ddc7dae65875f8b41c155c15581f698c10d6ef4dd Jan 31 15:03:09 crc kubenswrapper[4751]: I0131 15:03:09.919231 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.003260 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.041884 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"63d398be-aa92-4a00-933b-549a0c4e4ad7","Type":"ContainerStarted","Data":"af5a8366873c62793cd928a263608e14d01ee8087ae27093c452690e6adc2f31"}
Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.043817 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2142d4ca-115a-49b7-8f50-ac020fdbc342","Type":"ContainerStarted","Data":"9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a"}
Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.043859 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2142d4ca-115a-49b7-8f50-ac020fdbc342","Type":"ContainerStarted","Data":"5f1a0e7c6277e92312ce6862469a84b79e4f876e98e63b342abc9b0fa8fe5418"}
Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.048925 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"40930074-48c4-404d-a55c-bb8a4f581f56","Type":"ContainerStarted","Data":"566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9"}
Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.048974 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"40930074-48c4-404d-a55c-bb8a4f581f56","Type":"ContainerStarted","Data":"95584e8f76d354aa1bb1539546392e64aa9a3d998c839b11c54c3e4e4b46195b"}
Jan 31 15:03:10 crc kubenswrapper[4751]: I0131 15:03:10.051051 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f3d6d7db-fc12-479e-aedf-8ef829bf01e5","Type":"ContainerStarted","Data":"cc8b1dd3b31a488ef9b0862ddc7dae65875f8b41c155c15581f698c10d6ef4dd"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.060276 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"40930074-48c4-404d-a55c-bb8a4f581f56","Type":"ContainerStarted","Data":"16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.062532 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f3d6d7db-fc12-479e-aedf-8ef829bf01e5","Type":"ContainerStarted","Data":"1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.062626 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f3d6d7db-fc12-479e-aedf-8ef829bf01e5","Type":"ContainerStarted","Data":"9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.064588 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"63d398be-aa92-4a00-933b-549a0c4e4ad7","Type":"ContainerStarted","Data":"9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.064662 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"63d398be-aa92-4a00-933b-549a0c4e4ad7","Type":"ContainerStarted","Data":"136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.066493 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2142d4ca-115a-49b7-8f50-ac020fdbc342","Type":"ContainerStarted","Data":"dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317"}
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.115076 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=4.115051404 podStartE2EDuration="4.115051404s" podCreationTimestamp="2026-01-31 15:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:11.08302054 +0000 UTC m=+1295.457733425" watchObservedRunningTime="2026-01-31 15:03:11.115051404 +0000 UTC m=+1295.489764289"
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.115438 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=4.115431384 podStartE2EDuration="4.115431384s" podCreationTimestamp="2026-01-31 15:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:11.108623805 +0000 UTC m=+1295.483336750" watchObservedRunningTime="2026-01-31 15:03:11.115431384 +0000 UTC m=+1295.490144269"
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.141252 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.141226914 podStartE2EDuration="3.141226914s" podCreationTimestamp="2026-01-31 15:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:11.13920198 +0000 UTC m=+1295.513914875" watchObservedRunningTime="2026-01-31 15:03:11.141226914 +0000 UTC m=+1295.515939829"
Jan 31 15:03:11 crc kubenswrapper[4751]: I0131 15:03:11.173819 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.173796462 podStartE2EDuration="3.173796462s" podCreationTimestamp="2026-01-31 15:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:11.168905613 +0000 UTC m=+1295.543618548" watchObservedRunningTime="2026-01-31 15:03:11.173796462 +0000 UTC m=+1295.548509367"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.352851 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.353391 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.371370 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.373106 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.377330 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.395646 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.397554 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.413987 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.477929 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.477975 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.487875 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.488422 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.505540 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.525374 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.529823 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:19 crc kubenswrapper[4751]: I0131 15:03:19.554442 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.145799 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146244 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146311 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146393 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146455 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146513 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146575 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:20 crc kubenswrapper[4751]: I0131 15:03:20.146638 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.007312 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.012448 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.069913 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.116793 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.131551 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.158199 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.158198 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.158249 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.200671 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.231564 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:22 crc kubenswrapper[4751]: I0131 15:03:22.392342 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:03:23 crc kubenswrapper[4751]: I0131 15:03:23.714791 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"]
Jan 31 15:03:23 crc kubenswrapper[4751]: I0131 15:03:23.726978 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Jan 31 15:03:23 crc kubenswrapper[4751]: I0131 15:03:23.879574 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"]
Jan 31 15:03:23 crc kubenswrapper[4751]: I0131 15:03:23.895225 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Jan 31 15:03:24 crc kubenswrapper[4751]: I0131 15:03:24.171626 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-log" containerID="cri-o://9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a" gracePeriod=30
Jan 31 15:03:24 crc kubenswrapper[4751]: I0131 15:03:24.171734 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-httpd" containerID="cri-o://dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317" gracePeriod=30
Jan 31 15:03:24 crc kubenswrapper[4751]: I0131 15:03:24.172865 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-log" containerID="cri-o://566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9" gracePeriod=30
Jan 31 15:03:24 crc kubenswrapper[4751]: I0131 15:03:24.172895 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-httpd" containerID="cri-o://16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c" gracePeriod=30
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.182212 4751 generic.go:334] "Generic (PLEG): container finished" podID="40930074-48c4-404d-a55c-bb8a4f581f56" containerID="566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9" exitCode=143
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.182320 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"40930074-48c4-404d-a55c-bb8a4f581f56","Type":"ContainerDied","Data":"566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9"}
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.184819 4751 generic.go:334] "Generic (PLEG): container finished" podID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerID="9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a" exitCode=143
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.184907 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2142d4ca-115a-49b7-8f50-ac020fdbc342","Type":"ContainerDied","Data":"9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a"}
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.185237 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-log" containerID="cri-o://9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f" gracePeriod=30
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.185314 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-httpd" containerID="cri-o://1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8" gracePeriod=30
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.185617 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-log" containerID="cri-o://136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160" gracePeriod=30
Jan 31 15:03:25 crc kubenswrapper[4751]: I0131 15:03:25.185681 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-httpd" containerID="cri-o://9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7" gracePeriod=30
Jan 31 15:03:26 crc kubenswrapper[4751]: I0131 15:03:26.194850 4751 generic.go:334] "Generic (PLEG): container finished" podID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerID="9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f" exitCode=143
Jan 31 15:03:26 crc kubenswrapper[4751]: I0131 15:03:26.194947 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f3d6d7db-fc12-479e-aedf-8ef829bf01e5","Type":"ContainerDied","Data":"9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f"}
Jan 31 15:03:26 crc kubenswrapper[4751]: I0131 15:03:26.199221 4751 generic.go:334] "Generic (PLEG): container finished" podID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerID="136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160" exitCode=143
Jan 31 15:03:26 crc kubenswrapper[4751]: I0131 15:03:26.199275 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"63d398be-aa92-4a00-933b-549a0c4e4ad7","Type":"ContainerDied","Data":"136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160"}
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.796450 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2"
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.800858 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840583 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-nvme\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840631 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840668 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-logs\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840692 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-dev\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840724 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-var-locks-brick\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840738 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840755 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-httpd-run\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840852 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-iscsi\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840889 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-iscsi\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840943 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-dev" (OuterVolumeSpecName: "dev") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840955 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.840987 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841037 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841085 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841297 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-logs" (OuterVolumeSpecName: "logs") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841371 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-logs\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841618 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-logs" (OuterVolumeSpecName: "logs") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841645 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75pr7\" (UniqueName: \"kubernetes.io/projected/2142d4ca-115a-49b7-8f50-ac020fdbc342-kube-api-access-75pr7\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.841669 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-dev\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842141 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-scripts\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842170 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842189 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-lib-modules\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842216 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842234 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnnkf\" (UniqueName: \"kubernetes.io/projected/40930074-48c4-404d-a55c-bb8a4f581f56-kube-api-access-rnnkf\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842248 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-run\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842266 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-nvme\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842293 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-config-data\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842321 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-config-data\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842341 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-var-locks-brick\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842395 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-httpd-run\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842419 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842443 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-scripts\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842465 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-lib-modules\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842482 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-sys\") pod \"2142d4ca-115a-49b7-8f50-ac020fdbc342\" (UID: \"2142d4ca-115a-49b7-8f50-ac020fdbc342\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842535 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-sys\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.842561 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-run\") pod \"40930074-48c4-404d-a55c-bb8a4f581f56\" (UID: \"40930074-48c4-404d-a55c-bb8a4f581f56\") "
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843036 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843054 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843677 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843692 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843702 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843711 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843719 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843728 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2142d4ca-115a-49b7-8f50-ac020fdbc342-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843758 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-run" (OuterVolumeSpecName: "run") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.843790 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-dev" (OuterVolumeSpecName: "dev") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.845762 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.845795 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-sys" (OuterVolumeSpecName: "sys") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.845846 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.845870 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.845887 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-run" (OuterVolumeSpecName: "run") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.846213 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.846523 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.846570 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-sys" (OuterVolumeSpecName: "sys") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.849825 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40930074-48c4-404d-a55c-bb8a4f581f56-kube-api-access-rnnkf" (OuterVolumeSpecName: "kube-api-access-rnnkf") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "kube-api-access-rnnkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.851637 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.854365 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "local-storage04-crc".
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.854497 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.855364 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2142d4ca-115a-49b7-8f50-ac020fdbc342-kube-api-access-75pr7" (OuterVolumeSpecName: "kube-api-access-75pr7") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "kube-api-access-75pr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.858609 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-scripts" (OuterVolumeSpecName: "scripts") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.860742 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.862306 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-scripts" (OuterVolumeSpecName: "scripts") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.885408 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-config-data" (OuterVolumeSpecName: "config-data") pod "2142d4ca-115a-49b7-8f50-ac020fdbc342" (UID: "2142d4ca-115a-49b7-8f50-ac020fdbc342"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.890435 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-config-data" (OuterVolumeSpecName: "config-data") pod "40930074-48c4-404d-a55c-bb8a4f581f56" (UID: "40930074-48c4-404d-a55c-bb8a4f581f56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944637 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944669 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40930074-48c4-404d-a55c-bb8a4f581f56-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944701 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944709 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944719 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944727 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944735 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944743 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944754 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944763 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75pr7\" (UniqueName: \"kubernetes.io/projected/2142d4ca-115a-49b7-8f50-ac020fdbc342-kube-api-access-75pr7\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944772 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944781 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944792 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944801 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944812 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944820 4751 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rnnkf\" (UniqueName: \"kubernetes.io/projected/40930074-48c4-404d-a55c-bb8a4f581f56-kube-api-access-rnnkf\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944829 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2142d4ca-115a-49b7-8f50-ac020fdbc342-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944837 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40930074-48c4-404d-a55c-bb8a4f581f56-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944844 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40930074-48c4-404d-a55c-bb8a4f581f56-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.944851 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2142d4ca-115a-49b7-8f50-ac020fdbc342-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.960575 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.960705 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.962106 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 15:03:27 crc kubenswrapper[4751]: I0131 15:03:27.964163 4751 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.047459 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.047500 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.047538 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.047550 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.215352 4751 generic.go:334] "Generic (PLEG): container finished" podID="40930074-48c4-404d-a55c-bb8a4f581f56" containerID="16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c" exitCode=0 Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.215423 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"40930074-48c4-404d-a55c-bb8a4f581f56","Type":"ContainerDied","Data":"16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c"} Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.215454 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"40930074-48c4-404d-a55c-bb8a4f581f56","Type":"ContainerDied","Data":"95584e8f76d354aa1bb1539546392e64aa9a3d998c839b11c54c3e4e4b46195b"} Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.215473 4751 scope.go:117] "RemoveContainer" containerID="16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.215552 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.217239 4751 generic.go:334] "Generic (PLEG): container finished" podID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerID="dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317" exitCode=0 Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.217312 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2142d4ca-115a-49b7-8f50-ac020fdbc342","Type":"ContainerDied","Data":"dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317"} Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.217422 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2142d4ca-115a-49b7-8f50-ac020fdbc342","Type":"ContainerDied","Data":"5f1a0e7c6277e92312ce6862469a84b79e4f876e98e63b342abc9b0fa8fe5418"} Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.217336 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.238703 4751 scope.go:117] "RemoveContainer" containerID="566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.258306 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.269204 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.278609 4751 scope.go:117] "RemoveContainer" containerID="16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c" Jan 31 15:03:28 crc kubenswrapper[4751]: E0131 15:03:28.279327 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c\": container with ID starting with 16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c not found: ID does not exist" containerID="16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.279437 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c"} err="failed to get container status \"16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c\": rpc error: code = NotFound desc = could not find container \"16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c\": container with ID starting with 16e5e639ab2ee9cd835695e29e8846f4088bef1a2b7a0d28e0303930d813839c not found: ID does not exist" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.279474 4751 scope.go:117] "RemoveContainer" 
containerID="566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9" Jan 31 15:03:28 crc kubenswrapper[4751]: E0131 15:03:28.279786 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9\": container with ID starting with 566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9 not found: ID does not exist" containerID="566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.279867 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9"} err="failed to get container status \"566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9\": rpc error: code = NotFound desc = could not find container \"566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9\": container with ID starting with 566c7ba296d8a89d8b54dce02f2ecb937ecdc55998a198c89a4aa08ceab608d9 not found: ID does not exist" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.279900 4751 scope.go:117] "RemoveContainer" containerID="dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.301465 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.304604 4751 scope.go:117] "RemoveContainer" containerID="9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.311930 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.323908 4751 scope.go:117] "RemoveContainer" 
containerID="dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317" Jan 31 15:03:28 crc kubenswrapper[4751]: E0131 15:03:28.324472 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317\": container with ID starting with dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317 not found: ID does not exist" containerID="dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.324524 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317"} err="failed to get container status \"dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317\": rpc error: code = NotFound desc = could not find container \"dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317\": container with ID starting with dea90ac64b7d37ae389ad079e16e4c6e1e9d9d925e5e168a9ae5bb17022b8317 not found: ID does not exist" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.324558 4751 scope.go:117] "RemoveContainer" containerID="9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a" Jan 31 15:03:28 crc kubenswrapper[4751]: E0131 15:03:28.324926 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a\": container with ID starting with 9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a not found: ID does not exist" containerID="9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.324946 4751 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a"} err="failed to get container status \"9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a\": rpc error: code = NotFound desc = could not find container \"9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a\": container with ID starting with 9cf14d4a3826deef636e4c06d5c8aee7e5d6320538dd47d4daa7da644059233a not found: ID does not exist" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.414870 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" path="/var/lib/kubelet/pods/2142d4ca-115a-49b7-8f50-ac020fdbc342/volumes" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.415938 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" path="/var/lib/kubelet/pods/40930074-48c4-404d-a55c-bb8a4f581f56/volumes" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.743040 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.750170 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.857792 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-sys\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.857962 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-sys" (OuterVolumeSpecName: "sys") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.858125 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-dev" (OuterVolumeSpecName: "dev") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.858059 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-dev\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860307 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-lib-modules\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860363 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-config-data\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860382 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-run\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860415 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860437 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-sys\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860465 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860504 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-var-locks-brick\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860528 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-logs\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860552 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-logs\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860598 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-run\") pod 
\"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860616 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-lib-modules\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860656 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-config-data\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860680 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-scripts\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860699 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-nvme\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860755 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-scripts\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860787 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j7jg\" (UniqueName: 
\"kubernetes.io/projected/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-kube-api-access-7j7jg\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860843 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-nvme\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860870 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-httpd-run\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860920 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860940 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-iscsi\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.860988 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-iscsi\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861021 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861115 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-httpd-run\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861133 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-run" (OuterVolumeSpecName: "run") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861140 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-dev\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861169 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861173 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-logs" (OuterVolumeSpecName: "logs") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861198 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdj2p\" (UniqueName: \"kubernetes.io/projected/63d398be-aa92-4a00-933b-549a0c4e4ad7-kube-api-access-pdj2p\") pod \"63d398be-aa92-4a00-933b-549a0c4e4ad7\" (UID: \"63d398be-aa92-4a00-933b-549a0c4e4ad7\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861206 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-run" (OuterVolumeSpecName: "run") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861216 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-var-locks-brick\") pod \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\" (UID: \"f3d6d7db-fc12-479e-aedf-8ef829bf01e5\") " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861187 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-dev" (OuterVolumeSpecName: "dev") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861252 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861232 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861466 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-logs" (OuterVolumeSpecName: "logs") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861490 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861645 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861658 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861667 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861675 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861703 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861711 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861720 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861729 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861738 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861747 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.861756 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.862242 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-sys" (OuterVolumeSpecName: "sys") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.862347 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.862379 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.862761 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.862858 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.862892 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.863360 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.863805 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance-cache") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.864843 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-scripts" (OuterVolumeSpecName: "scripts") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.865610 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-kube-api-access-7j7jg" (OuterVolumeSpecName: "kube-api-access-7j7jg") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "kube-api-access-7j7jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.866046 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-scripts" (OuterVolumeSpecName: "scripts") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.866512 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.867392 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d398be-aa92-4a00-933b-549a0c4e4ad7-kube-api-access-pdj2p" (OuterVolumeSpecName: "kube-api-access-pdj2p") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "kube-api-access-pdj2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.868211 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance-cache") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.869200 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.904231 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-config-data" (OuterVolumeSpecName: "config-data") pod "f3d6d7db-fc12-479e-aedf-8ef829bf01e5" (UID: "f3d6d7db-fc12-479e-aedf-8ef829bf01e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.911338 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-config-data" (OuterVolumeSpecName: "config-data") pod "63d398be-aa92-4a00-933b-549a0c4e4ad7" (UID: "63d398be-aa92-4a00-933b-549a0c4e4ad7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963104 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdj2p\" (UniqueName: \"kubernetes.io/projected/63d398be-aa92-4a00-933b-549a0c4e4ad7-kube-api-access-pdj2p\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963147 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963160 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963195 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963204 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963213 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963221 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963232 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/63d398be-aa92-4a00-933b-549a0c4e4ad7-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963240 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j7jg\" (UniqueName: \"kubernetes.io/projected/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-kube-api-access-7j7jg\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963249 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963257 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963271 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963279 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f3d6d7db-fc12-479e-aedf-8ef829bf01e5-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963287 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63d398be-aa92-4a00-933b-549a0c4e4ad7-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963300 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963309 
4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63d398be-aa92-4a00-933b-549a0c4e4ad7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.963322 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.976274 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.977228 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.977326 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 31 15:03:28 crc kubenswrapper[4751]: I0131 15:03:28.977497 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.064741 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.064770 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.064779 4751 reconciler_common.go:293] "Volume detached for volume 
\"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.064788 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.227543 4751 generic.go:334] "Generic (PLEG): container finished" podID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerID="9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7" exitCode=0 Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.227604 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.227643 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"63d398be-aa92-4a00-933b-549a0c4e4ad7","Type":"ContainerDied","Data":"9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7"} Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.227725 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"63d398be-aa92-4a00-933b-549a0c4e4ad7","Type":"ContainerDied","Data":"af5a8366873c62793cd928a263608e14d01ee8087ae27093c452690e6adc2f31"} Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.227774 4751 scope.go:117] "RemoveContainer" containerID="9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.233127 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.233161 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f3d6d7db-fc12-479e-aedf-8ef829bf01e5","Type":"ContainerDied","Data":"1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8"} Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.233060 4751 generic.go:334] "Generic (PLEG): container finished" podID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerID="1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8" exitCode=0 Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.233347 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"f3d6d7db-fc12-479e-aedf-8ef829bf01e5","Type":"ContainerDied","Data":"cc8b1dd3b31a488ef9b0862ddc7dae65875f8b41c155c15581f698c10d6ef4dd"} Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.251696 4751 scope.go:117] "RemoveContainer" containerID="136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.271883 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.275408 4751 scope.go:117] "RemoveContainer" containerID="9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7" Jan 31 15:03:29 crc kubenswrapper[4751]: E0131 15:03:29.275932 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7\": container with ID starting with 9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7 not found: ID does not exist" containerID="9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7" Jan 31 15:03:29 crc 
kubenswrapper[4751]: I0131 15:03:29.275982 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7"} err="failed to get container status \"9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7\": rpc error: code = NotFound desc = could not find container \"9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7\": container with ID starting with 9c261553a8935996c28e0989101c6aa4dfa134fe2060a0f6d58b3c2531dcdda7 not found: ID does not exist" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.276019 4751 scope.go:117] "RemoveContainer" containerID="136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160" Jan 31 15:03:29 crc kubenswrapper[4751]: E0131 15:03:29.276357 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160\": container with ID starting with 136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160 not found: ID does not exist" containerID="136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.276397 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160"} err="failed to get container status \"136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160\": rpc error: code = NotFound desc = could not find container \"136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160\": container with ID starting with 136f5cdff8be1979c81bded97141738e6801ed16ea2d04c80f29ec3512279160 not found: ID does not exist" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.276420 4751 scope.go:117] "RemoveContainer" containerID="1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8" Jan 31 
15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.282623 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.291928 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.295453 4751 scope.go:117] "RemoveContainer" containerID="9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.298103 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.312848 4751 scope.go:117] "RemoveContainer" containerID="1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8" Jan 31 15:03:29 crc kubenswrapper[4751]: E0131 15:03:29.313277 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8\": container with ID starting with 1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8 not found: ID does not exist" containerID="1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.313316 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8"} err="failed to get container status \"1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8\": rpc error: code = NotFound desc = could not find container \"1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8\": container with ID starting with 1dc237ef9e49d030bd026bc0cf53f0f8764a60860c14efa8846ee1aaa6f4fde8 not found: ID does not exist" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.313345 4751 
scope.go:117] "RemoveContainer" containerID="9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f" Jan 31 15:03:29 crc kubenswrapper[4751]: E0131 15:03:29.313721 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f\": container with ID starting with 9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f not found: ID does not exist" containerID="9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f" Jan 31 15:03:29 crc kubenswrapper[4751]: I0131 15:03:29.313751 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f"} err="failed to get container status \"9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f\": rpc error: code = NotFound desc = could not find container \"9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f\": container with ID starting with 9a94ed6a4818186063a21a203b752d947f2a19e75f8bff027ed517efba40515f not found: ID does not exist" Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.114110 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.114452 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-log" containerID="cri-o://1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d" gracePeriod=30 Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.114625 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-httpd" 
containerID="cri-o://46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9" gracePeriod=30
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.252942 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerID="1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d" exitCode=143
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.253117 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"a5d5c53d-eea5-4866-983f-8477eb16177b","Type":"ContainerDied","Data":"1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d"}
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.416724 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" path="/var/lib/kubelet/pods/63d398be-aa92-4a00-933b-549a0c4e4ad7/volumes"
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.417567 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" path="/var/lib/kubelet/pods/f3d6d7db-fc12-479e-aedf-8ef829bf01e5/volumes"
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.854707 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.855163 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-log" containerID="cri-o://5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480" gracePeriod=30
Jan 31 15:03:30 crc kubenswrapper[4751]: I0131 15:03:30.855606 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-httpd" containerID="cri-o://6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635" gracePeriod=30
Jan 31 15:03:31 crc kubenswrapper[4751]: I0131 15:03:31.265119 4751 generic.go:334] "Generic (PLEG): container finished" podID="1c3cde72-72a2-4a51-a061-06397061de3c" containerID="5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480" exitCode=143
Jan 31 15:03:31 crc kubenswrapper[4751]: I0131 15:03:31.265168 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c3cde72-72a2-4a51-a061-06397061de3c","Type":"ContainerDied","Data":"5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480"}
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.652038 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732092 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-config-data\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732151 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-lib-modules\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732218 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llcmb\" (UniqueName: \"kubernetes.io/projected/a5d5c53d-eea5-4866-983f-8477eb16177b-kube-api-access-llcmb\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732237 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-iscsi\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732264 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-scripts\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732317 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732303 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732339 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-httpd-run\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732351 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732385 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-dev" (OuterVolumeSpecName: "dev") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732360 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-dev\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732430 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732480 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-var-locks-brick\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732503 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-sys\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732564 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-run\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732598 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-logs\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732656 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-nvme\") pod \"a5d5c53d-eea5-4866-983f-8477eb16177b\" (UID: \"a5d5c53d-eea5-4866-983f-8477eb16177b\") "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.732970 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733046 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733185 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733207 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733216 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733224 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-lib-modules\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733233 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733257 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-sys" (OuterVolumeSpecName: "sys") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733277 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733297 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-run" (OuterVolumeSpecName: "run") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.733491 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-logs" (OuterVolumeSpecName: "logs") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.737174 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.737209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-scripts" (OuterVolumeSpecName: "scripts") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.737667 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d5c53d-eea5-4866-983f-8477eb16177b-kube-api-access-llcmb" (OuterVolumeSpecName: "kube-api-access-llcmb") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "kube-api-access-llcmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.737706 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.778476 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-config-data" (OuterVolumeSpecName: "config-data") pod "a5d5c53d-eea5-4866-983f-8477eb16177b" (UID: "a5d5c53d-eea5-4866-983f-8477eb16177b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834768 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834802 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834818 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834834 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834846 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a5d5c53d-eea5-4866-983f-8477eb16177b-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834857 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d5c53d-eea5-4866-983f-8477eb16177b-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834864 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834872 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llcmb\" (UniqueName: \"kubernetes.io/projected/a5d5c53d-eea5-4866-983f-8477eb16177b-kube-api-access-llcmb\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.834880 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d5c53d-eea5-4866-983f-8477eb16177b-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.850359 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.852319 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.935535 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:33 crc kubenswrapper[4751]: I0131 15:03:33.935568 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.277810 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.296682 4751 generic.go:334] "Generic (PLEG): container finished" podID="1c3cde72-72a2-4a51-a061-06397061de3c" containerID="6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635" exitCode=0
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.296724 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c3cde72-72a2-4a51-a061-06397061de3c","Type":"ContainerDied","Data":"6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635"}
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.296771 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c3cde72-72a2-4a51-a061-06397061de3c","Type":"ContainerDied","Data":"c898b2da5714126a65ee741ceb3ed33e63dd1016489a4c5e916d0a834f254ea4"}
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.296741 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.296793 4751 scope.go:117] "RemoveContainer" containerID="6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.310298 4751 generic.go:334] "Generic (PLEG): container finished" podID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerID="46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9" exitCode=0
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.310337 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"a5d5c53d-eea5-4866-983f-8477eb16177b","Type":"ContainerDied","Data":"46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9"}
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.310360 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"a5d5c53d-eea5-4866-983f-8477eb16177b","Type":"ContainerDied","Data":"f196d65739ae0a450d1f988eb2e240599d07c3bc3006a4085be84398709e7ee3"}
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.310389 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.334287 4751 scope.go:117] "RemoveContainer" containerID="5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.342886 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343447 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-lib-modules\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343550 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-var-locks-brick\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343620 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-nvme\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343689 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-logs\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343807 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-httpd-run\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343884 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxgsq\" (UniqueName: \"kubernetes.io/projected/1c3cde72-72a2-4a51-a061-06397061de3c-kube-api-access-kxgsq\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.343949 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-run\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.344177 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-sys\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.344262 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-config-data\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.344339 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-dev\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.344409 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-iscsi\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.344485 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-scripts\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.344559 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"1c3cde72-72a2-4a51-a061-06397061de3c\" (UID: \"1c3cde72-72a2-4a51-a061-06397061de3c\") "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.346436 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-run" (OuterVolumeSpecName: "run") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.346520 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-sys" (OuterVolumeSpecName: "sys") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.346532 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.346751 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.346909 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.347042 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.347140 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-logs" (OuterVolumeSpecName: "logs") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.347154 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.348276 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-dev" (OuterVolumeSpecName: "dev") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.349400 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.349786 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.357244 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.357368 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-scripts" (OuterVolumeSpecName: "scripts") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.363100 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.363865 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3cde72-72a2-4a51-a061-06397061de3c-kube-api-access-kxgsq" (OuterVolumeSpecName: "kube-api-access-kxgsq") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "kube-api-access-kxgsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.390968 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-config-data" (OuterVolumeSpecName: "config-data") pod "1c3cde72-72a2-4a51-a061-06397061de3c" (UID: "1c3cde72-72a2-4a51-a061-06397061de3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.416753 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" path="/var/lib/kubelet/pods/a5d5c53d-eea5-4866-983f-8477eb16177b/volumes"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.445808 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.445944 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446014 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446124 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" "
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446370 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-lib-modules\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446663 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446684 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446693 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446703 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c3cde72-72a2-4a51-a061-06397061de3c-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446712 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxgsq\" (UniqueName: \"kubernetes.io/projected/1c3cde72-72a2-4a51-a061-06397061de3c-kube-api-access-kxgsq\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446723 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446731 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446740 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3cde72-72a2-4a51-a061-06397061de3c-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.446748 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c3cde72-72a2-4a51-a061-06397061de3c-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.448678 4751 scope.go:117] "RemoveContainer" containerID="6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635"
Jan 31 15:03:34 crc kubenswrapper[4751]: E0131 15:03:34.449034 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635\": container with ID starting with 6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635 not found: ID does not exist" containerID="6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.449077 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635"} err="failed to get container status \"6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635\": rpc error: code = NotFound desc = could not find container \"6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635\": container with ID starting with 6735f01e0fcf3a4be1978348937bbe37355d165dbd9469a97f61438e89b09635 not found: ID does not exist"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.449099 4751 scope.go:117] "RemoveContainer" containerID="5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480"
Jan 31 15:03:34 crc kubenswrapper[4751]: E0131 15:03:34.449962 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480\": container with ID starting with 5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480 not found: ID does not exist" containerID="5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.449986 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480"} err="failed to get container status \"5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480\": rpc error: code = NotFound desc = could not find container \"5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480\": container with ID starting with 5f91babe44ec234dd474a96b96ebc6224d6c3a4f5e9aa9658f789a37025d6480 not found: ID does not exist"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.450000 4751 scope.go:117] "RemoveContainer" containerID="46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.460268 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.461730 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.470807 4751 scope.go:117] "RemoveContainer" containerID="1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.487087 4751 scope.go:117] "RemoveContainer" containerID="46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9"
Jan 31 15:03:34 crc kubenswrapper[4751]: E0131 15:03:34.487467 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9\": container with ID starting with 46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9 not found: ID does not exist" containerID="46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.487572 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9"} err="failed to get container status \"46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9\": rpc error: code = NotFound desc = could not find container \"46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9\": container with ID starting with 46e6d1ac28c6a7caa669a8786fb2699fde254ad44262a93b0f5099c64c3baee9 not found: ID does not exist"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.487648 4751 scope.go:117] "RemoveContainer" containerID="1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d"
Jan 31 15:03:34 crc kubenswrapper[4751]: E0131 15:03:34.488020 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d\": container with ID starting with 1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d not found: ID does not exist" containerID="1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.488088 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d"} err="failed to get container status \"1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d\": rpc error: code = NotFound desc = could not find container \"1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d\": container with ID starting with 1130e67347fe51af0e3a73480eea09807ed13c49b680ae612acbf9d7812bf42d not found: ID does not exist"
Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.547594 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath 
\"\"" Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.547630 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.621282 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:03:34 crc kubenswrapper[4751]: I0131 15:03:34.626761 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.724091 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-pn752"] Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.731596 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-pn752"] Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.791762 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance5797-account-delete-bfbxf"] Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.792323 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.792420 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.792493 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.792566 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.792752 4751 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.792827 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.792915 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.792985 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.793060 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.793148 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.793245 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.793321 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.793405 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.793470 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.793581 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" 
containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.793650 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.793748 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.793817 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.793894 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.793963 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.794044 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794138 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: E0131 15:03:35.794207 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794290 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794505 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-log" Jan 31 15:03:35 crc 
kubenswrapper[4751]: I0131 15:03:35.794580 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794648 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794715 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="40930074-48c4-404d-a55c-bb8a4f581f56" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794780 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794857 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d6d7db-fc12-479e-aedf-8ef829bf01e5" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.794941 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d5c53d-eea5-4866-983f-8477eb16177b" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.795014 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.795102 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2142d4ca-115a-49b7-8f50-ac020fdbc342" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.795176 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-log" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.795249 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" containerName="glance-log" Jan 31 15:03:35 crc 
kubenswrapper[4751]: I0131 15:03:35.795336 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d398be-aa92-4a00-933b-549a0c4e4ad7" containerName="glance-httpd" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.796004 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.805127 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5797-account-delete-bfbxf"] Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.867768 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-operator-scripts\") pod \"glance5797-account-delete-bfbxf\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.867842 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5d7p\" (UniqueName: \"kubernetes.io/projected/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-kube-api-access-r5d7p\") pod \"glance5797-account-delete-bfbxf\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.969017 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-operator-scripts\") pod \"glance5797-account-delete-bfbxf\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.969133 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5d7p\" 
(UniqueName: \"kubernetes.io/projected/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-kube-api-access-r5d7p\") pod \"glance5797-account-delete-bfbxf\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.969723 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-operator-scripts\") pod \"glance5797-account-delete-bfbxf\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:35 crc kubenswrapper[4751]: I0131 15:03:35.999459 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5d7p\" (UniqueName: \"kubernetes.io/projected/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-kube-api-access-r5d7p\") pod \"glance5797-account-delete-bfbxf\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:36 crc kubenswrapper[4751]: I0131 15:03:36.112038 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:36 crc kubenswrapper[4751]: I0131 15:03:36.419458 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c3cde72-72a2-4a51-a061-06397061de3c" path="/var/lib/kubelet/pods/1c3cde72-72a2-4a51-a061-06397061de3c/volumes" Jan 31 15:03:36 crc kubenswrapper[4751]: I0131 15:03:36.420758 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d350f693-ea74-48d5-a7a7-3fa3264174ca" path="/var/lib/kubelet/pods/d350f693-ea74-48d5-a7a7-3fa3264174ca/volumes" Jan 31 15:03:36 crc kubenswrapper[4751]: I0131 15:03:36.530709 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance5797-account-delete-bfbxf"] Jan 31 15:03:37 crc kubenswrapper[4751]: I0131 15:03:37.340427 4751 generic.go:334] "Generic (PLEG): container finished" podID="31a4e4de-2e47-46c3-8b73-06c6a7fe5282" containerID="04a2620fed6cde572c43eab031fe61d9c4a7478ffe007510ee4e0e1e7a876ff4" exitCode=0 Jan 31 15:03:37 crc kubenswrapper[4751]: I0131 15:03:37.340730 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" event={"ID":"31a4e4de-2e47-46c3-8b73-06c6a7fe5282","Type":"ContainerDied","Data":"04a2620fed6cde572c43eab031fe61d9c4a7478ffe007510ee4e0e1e7a876ff4"} Jan 31 15:03:37 crc kubenswrapper[4751]: I0131 15:03:37.340784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" event={"ID":"31a4e4de-2e47-46c3-8b73-06c6a7fe5282","Type":"ContainerStarted","Data":"8db058d18edbc68f572a370f2e2fbcbf76b515e8af7d5c00e3296e137637bb91"} Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.651420 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.703726 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-operator-scripts\") pod \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.703805 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5d7p\" (UniqueName: \"kubernetes.io/projected/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-kube-api-access-r5d7p\") pod \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\" (UID: \"31a4e4de-2e47-46c3-8b73-06c6a7fe5282\") " Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.705247 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31a4e4de-2e47-46c3-8b73-06c6a7fe5282" (UID: "31a4e4de-2e47-46c3-8b73-06c6a7fe5282"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.711633 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-kube-api-access-r5d7p" (OuterVolumeSpecName: "kube-api-access-r5d7p") pod "31a4e4de-2e47-46c3-8b73-06c6a7fe5282" (UID: "31a4e4de-2e47-46c3-8b73-06c6a7fe5282"). InnerVolumeSpecName "kube-api-access-r5d7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.805047 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5d7p\" (UniqueName: \"kubernetes.io/projected/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-kube-api-access-r5d7p\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:38 crc kubenswrapper[4751]: I0131 15:03:38.805100 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4e4de-2e47-46c3-8b73-06c6a7fe5282-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:39 crc kubenswrapper[4751]: I0131 15:03:39.358264 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" event={"ID":"31a4e4de-2e47-46c3-8b73-06c6a7fe5282","Type":"ContainerDied","Data":"8db058d18edbc68f572a370f2e2fbcbf76b515e8af7d5c00e3296e137637bb91"} Jan 31 15:03:39 crc kubenswrapper[4751]: I0131 15:03:39.358302 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8db058d18edbc68f572a370f2e2fbcbf76b515e8af7d5c00e3296e137637bb91" Jan 31 15:03:39 crc kubenswrapper[4751]: I0131 15:03:39.358306 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance5797-account-delete-bfbxf" Jan 31 15:03:40 crc kubenswrapper[4751]: I0131 15:03:40.822156 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-b924d"] Jan 31 15:03:40 crc kubenswrapper[4751]: I0131 15:03:40.829767 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-b924d"] Jan 31 15:03:40 crc kubenswrapper[4751]: I0131 15:03:40.837475 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance5797-account-delete-bfbxf"] Jan 31 15:03:40 crc kubenswrapper[4751]: I0131 15:03:40.845785 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-5797-account-create-update-hfdl2"] Jan 31 15:03:40 crc kubenswrapper[4751]: I0131 15:03:40.852728 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance5797-account-delete-bfbxf"] Jan 31 15:03:40 crc kubenswrapper[4751]: I0131 15:03:40.858038 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-5797-account-create-update-hfdl2"] Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.480624 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-633a-account-create-update-kmj9d"] Jan 31 15:03:41 crc kubenswrapper[4751]: E0131 15:03:41.481671 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a4e4de-2e47-46c3-8b73-06c6a7fe5282" containerName="mariadb-account-delete" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.481705 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a4e4de-2e47-46c3-8b73-06c6a7fe5282" containerName="mariadb-account-delete" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.482031 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a4e4de-2e47-46c3-8b73-06c6a7fe5282" containerName="mariadb-account-delete" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 
15:03:41.482827 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.487580 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.489242 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-9z5l9"] Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.490251 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.496335 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-633a-account-create-update-kmj9d"] Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.507096 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-9z5l9"] Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.644015 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66e78e-6853-45e7-966f-cd9ec9586439-operator-scripts\") pod \"glance-db-create-9z5l9\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") " pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.644058 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcl4h\" (UniqueName: \"kubernetes.io/projected/8028c623-f182-4e00-9c6d-c864a023abb5-kube-api-access-hcl4h\") pod \"glance-633a-account-create-update-kmj9d\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") " pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.644180 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnr8g\" (UniqueName: \"kubernetes.io/projected/4d66e78e-6853-45e7-966f-cd9ec9586439-kube-api-access-qnr8g\") pod \"glance-db-create-9z5l9\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") " pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.644212 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8028c623-f182-4e00-9c6d-c864a023abb5-operator-scripts\") pod \"glance-633a-account-create-update-kmj9d\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") " pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.745542 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnr8g\" (UniqueName: \"kubernetes.io/projected/4d66e78e-6853-45e7-966f-cd9ec9586439-kube-api-access-qnr8g\") pod \"glance-db-create-9z5l9\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") " pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.745616 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8028c623-f182-4e00-9c6d-c864a023abb5-operator-scripts\") pod \"glance-633a-account-create-update-kmj9d\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") " pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.745698 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66e78e-6853-45e7-966f-cd9ec9586439-operator-scripts\") pod \"glance-db-create-9z5l9\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") " pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 
crc kubenswrapper[4751]: I0131 15:03:41.745730 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcl4h\" (UniqueName: \"kubernetes.io/projected/8028c623-f182-4e00-9c6d-c864a023abb5-kube-api-access-hcl4h\") pod \"glance-633a-account-create-update-kmj9d\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") " pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.747020 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8028c623-f182-4e00-9c6d-c864a023abb5-operator-scripts\") pod \"glance-633a-account-create-update-kmj9d\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") " pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.747629 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66e78e-6853-45e7-966f-cd9ec9586439-operator-scripts\") pod \"glance-db-create-9z5l9\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") " pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.766672 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnr8g\" (UniqueName: \"kubernetes.io/projected/4d66e78e-6853-45e7-966f-cd9ec9586439-kube-api-access-qnr8g\") pod \"glance-db-create-9z5l9\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") " pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.766990 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcl4h\" (UniqueName: \"kubernetes.io/projected/8028c623-f182-4e00-9c6d-c864a023abb5-kube-api-access-hcl4h\") pod \"glance-633a-account-create-update-kmj9d\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") " 
pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.804259 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" Jan 31 15:03:41 crc kubenswrapper[4751]: I0131 15:03:41.814125 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-9z5l9" Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.107656 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-9z5l9"] Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.257470 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-633a-account-create-update-kmj9d"] Jan 31 15:03:42 crc kubenswrapper[4751]: W0131 15:03:42.265213 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8028c623_f182_4e00_9c6d_c864a023abb5.slice/crio-56637ac13b21967f19a3fac42b6a3500babb3afbe65dda72046eb3b31f096f6c WatchSource:0}: Error finding container 56637ac13b21967f19a3fac42b6a3500babb3afbe65dda72046eb3b31f096f6c: Status 404 returned error can't find the container with id 56637ac13b21967f19a3fac42b6a3500babb3afbe65dda72046eb3b31f096f6c Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.383514 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" event={"ID":"8028c623-f182-4e00-9c6d-c864a023abb5","Type":"ContainerStarted","Data":"25f84d0f51f45c02503d2025ee5bcd86d54fb4126f654afe8e8c27150f9da926"} Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.383613 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" 
event={"ID":"8028c623-f182-4e00-9c6d-c864a023abb5","Type":"ContainerStarted","Data":"56637ac13b21967f19a3fac42b6a3500babb3afbe65dda72046eb3b31f096f6c"}
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.388335 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-9z5l9" event={"ID":"4d66e78e-6853-45e7-966f-cd9ec9586439","Type":"ContainerStarted","Data":"2e90cc31ff36ceaadebe3379b42c48741b099a92854117d92e72b66bca77ad69"}
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.388382 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-9z5l9" event={"ID":"4d66e78e-6853-45e7-966f-cd9ec9586439","Type":"ContainerStarted","Data":"dfb2dd4464903e172bc85af5fa5fcbb80bcb16e0435c60c5df0d3d28f7986ca4"}
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.400651 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" podStartSLOduration=1.400625856 podStartE2EDuration="1.400625856s" podCreationTimestamp="2026-01-31 15:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:42.39775125 +0000 UTC m=+1326.772464135" watchObservedRunningTime="2026-01-31 15:03:42.400625856 +0000 UTC m=+1326.775338741"
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.414135 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a4e4de-2e47-46c3-8b73-06c6a7fe5282" path="/var/lib/kubelet/pods/31a4e4de-2e47-46c3-8b73-06c6a7fe5282/volumes"
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.414808 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c33299-57ac-4fc9-9751-b521d31e60cf" path="/var/lib/kubelet/pods/58c33299-57ac-4fc9-9751-b521d31e60cf/volumes"
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.415281 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896f2e37-3440-46e7-81ed-2805ab336470" path="/var/lib/kubelet/pods/896f2e37-3440-46e7-81ed-2805ab336470/volumes"
Jan 31 15:03:42 crc kubenswrapper[4751]: I0131 15:03:42.417986 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-9z5l9" podStartSLOduration=1.4179702330000001 podStartE2EDuration="1.417970233s" podCreationTimestamp="2026-01-31 15:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:42.416964237 +0000 UTC m=+1326.791677112" watchObservedRunningTime="2026-01-31 15:03:42.417970233 +0000 UTC m=+1326.792683118"
Jan 31 15:03:43 crc kubenswrapper[4751]: I0131 15:03:43.400031 4751 generic.go:334] "Generic (PLEG): container finished" podID="8028c623-f182-4e00-9c6d-c864a023abb5" containerID="25f84d0f51f45c02503d2025ee5bcd86d54fb4126f654afe8e8c27150f9da926" exitCode=0
Jan 31 15:03:43 crc kubenswrapper[4751]: I0131 15:03:43.400130 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" event={"ID":"8028c623-f182-4e00-9c6d-c864a023abb5","Type":"ContainerDied","Data":"25f84d0f51f45c02503d2025ee5bcd86d54fb4126f654afe8e8c27150f9da926"}
Jan 31 15:03:43 crc kubenswrapper[4751]: I0131 15:03:43.403990 4751 generic.go:334] "Generic (PLEG): container finished" podID="4d66e78e-6853-45e7-966f-cd9ec9586439" containerID="2e90cc31ff36ceaadebe3379b42c48741b099a92854117d92e72b66bca77ad69" exitCode=0
Jan 31 15:03:43 crc kubenswrapper[4751]: I0131 15:03:43.404037 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-9z5l9" event={"ID":"4d66e78e-6853-45e7-966f-cd9ec9586439","Type":"ContainerDied","Data":"2e90cc31ff36ceaadebe3379b42c48741b099a92854117d92e72b66bca77ad69"}
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.769667 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-9z5l9"
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.776469 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d"
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.794208 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8028c623-f182-4e00-9c6d-c864a023abb5-operator-scripts\") pod \"8028c623-f182-4e00-9c6d-c864a023abb5\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") "
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.794290 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66e78e-6853-45e7-966f-cd9ec9586439-operator-scripts\") pod \"4d66e78e-6853-45e7-966f-cd9ec9586439\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") "
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.794378 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnr8g\" (UniqueName: \"kubernetes.io/projected/4d66e78e-6853-45e7-966f-cd9ec9586439-kube-api-access-qnr8g\") pod \"4d66e78e-6853-45e7-966f-cd9ec9586439\" (UID: \"4d66e78e-6853-45e7-966f-cd9ec9586439\") "
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.794421 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcl4h\" (UniqueName: \"kubernetes.io/projected/8028c623-f182-4e00-9c6d-c864a023abb5-kube-api-access-hcl4h\") pod \"8028c623-f182-4e00-9c6d-c864a023abb5\" (UID: \"8028c623-f182-4e00-9c6d-c864a023abb5\") "
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.802307 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028c623-f182-4e00-9c6d-c864a023abb5-kube-api-access-hcl4h" (OuterVolumeSpecName: "kube-api-access-hcl4h") pod "8028c623-f182-4e00-9c6d-c864a023abb5" (UID: "8028c623-f182-4e00-9c6d-c864a023abb5"). InnerVolumeSpecName "kube-api-access-hcl4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.803936 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d66e78e-6853-45e7-966f-cd9ec9586439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d66e78e-6853-45e7-966f-cd9ec9586439" (UID: "4d66e78e-6853-45e7-966f-cd9ec9586439"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.804990 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028c623-f182-4e00-9c6d-c864a023abb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8028c623-f182-4e00-9c6d-c864a023abb5" (UID: "8028c623-f182-4e00-9c6d-c864a023abb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.810717 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d66e78e-6853-45e7-966f-cd9ec9586439-kube-api-access-qnr8g" (OuterVolumeSpecName: "kube-api-access-qnr8g") pod "4d66e78e-6853-45e7-966f-cd9ec9586439" (UID: "4d66e78e-6853-45e7-966f-cd9ec9586439"). InnerVolumeSpecName "kube-api-access-qnr8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.895578 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcl4h\" (UniqueName: \"kubernetes.io/projected/8028c623-f182-4e00-9c6d-c864a023abb5-kube-api-access-hcl4h\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.895614 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8028c623-f182-4e00-9c6d-c864a023abb5-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.895623 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d66e78e-6853-45e7-966f-cd9ec9586439-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:44 crc kubenswrapper[4751]: I0131 15:03:44.895631 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnr8g\" (UniqueName: \"kubernetes.io/projected/4d66e78e-6853-45e7-966f-cd9ec9586439-kube-api-access-qnr8g\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.429238 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d" event={"ID":"8028c623-f182-4e00-9c6d-c864a023abb5","Type":"ContainerDied","Data":"56637ac13b21967f19a3fac42b6a3500babb3afbe65dda72046eb3b31f096f6c"}
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.429598 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56637ac13b21967f19a3fac42b6a3500babb3afbe65dda72046eb3b31f096f6c"
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.429368 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-633a-account-create-update-kmj9d"
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.432759 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-9z5l9"
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.433050 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-9z5l9" event={"ID":"4d66e78e-6853-45e7-966f-cd9ec9586439","Type":"ContainerDied","Data":"dfb2dd4464903e172bc85af5fa5fcbb80bcb16e0435c60c5df0d3d28f7986ca4"}
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.433283 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfb2dd4464903e172bc85af5fa5fcbb80bcb16e0435c60c5df0d3d28f7986ca4"
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.711171 4751 scope.go:117] "RemoveContainer" containerID="c5d30bd3425343861aefae2acc945d17403c59649b3737361473864cd06659ea"
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.731322 4751 scope.go:117] "RemoveContainer" containerID="8f5e6c80881d23c78dc00e4be207273e5eb7f1474c90cd90d5b02783a4206916"
Jan 31 15:03:45 crc kubenswrapper[4751]: I0131 15:03:45.774885 4751 scope.go:117] "RemoveContainer" containerID="0a2dcc31122c7c5482843a5e80399a6846c7271da25c796eef9ce298a6180701"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.693164 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-zzxfv"]
Jan 31 15:03:46 crc kubenswrapper[4751]: E0131 15:03:46.693646 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8028c623-f182-4e00-9c6d-c864a023abb5" containerName="mariadb-account-create-update"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.693676 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8028c623-f182-4e00-9c6d-c864a023abb5" containerName="mariadb-account-create-update"
Jan 31 15:03:46 crc kubenswrapper[4751]: E0131 15:03:46.693707 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d66e78e-6853-45e7-966f-cd9ec9586439" containerName="mariadb-database-create"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.693726 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d66e78e-6853-45e7-966f-cd9ec9586439" containerName="mariadb-database-create"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.693959 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8028c623-f182-4e00-9c6d-c864a023abb5" containerName="mariadb-account-create-update"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.694019 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d66e78e-6853-45e7-966f-cd9ec9586439" containerName="mariadb-database-create"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.694783 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.702136 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zzxfv"]
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.702276 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-lgkvw"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.702276 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.723125 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6f8n\" (UniqueName: \"kubernetes.io/projected/e91c4dc3-9319-4e4b-951a-4e1f117c3215-kube-api-access-n6f8n\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.723184 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-config-data\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.723203 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-db-sync-config-data\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.824481 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-config-data\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.824533 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-db-sync-config-data\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.824642 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6f8n\" (UniqueName: \"kubernetes.io/projected/e91c4dc3-9319-4e4b-951a-4e1f117c3215-kube-api-access-n6f8n\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.829349 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-db-sync-config-data\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.838829 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-config-data\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:46 crc kubenswrapper[4751]: I0131 15:03:46.849169 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6f8n\" (UniqueName: \"kubernetes.io/projected/e91c4dc3-9319-4e4b-951a-4e1f117c3215-kube-api-access-n6f8n\") pod \"glance-db-sync-zzxfv\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") " pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:47 crc kubenswrapper[4751]: I0131 15:03:47.014512 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:47 crc kubenswrapper[4751]: I0131 15:03:47.435916 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zzxfv"]
Jan 31 15:03:47 crc kubenswrapper[4751]: W0131 15:03:47.440219 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode91c4dc3_9319_4e4b_951a_4e1f117c3215.slice/crio-6a6ad5508209ce047cf73ef6b50355c4bb3cbbab034952d341b7d096d51f510f WatchSource:0}: Error finding container 6a6ad5508209ce047cf73ef6b50355c4bb3cbbab034952d341b7d096d51f510f: Status 404 returned error can't find the container with id 6a6ad5508209ce047cf73ef6b50355c4bb3cbbab034952d341b7d096d51f510f
Jan 31 15:03:48 crc kubenswrapper[4751]: I0131 15:03:48.458148 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zzxfv" event={"ID":"e91c4dc3-9319-4e4b-951a-4e1f117c3215","Type":"ContainerStarted","Data":"15a7c13661f7a9dc9cca48ea38cbda46b049856ab09d05ef63e1d7c0a14b8bb5"}
Jan 31 15:03:48 crc kubenswrapper[4751]: I0131 15:03:48.458519 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zzxfv" event={"ID":"e91c4dc3-9319-4e4b-951a-4e1f117c3215","Type":"ContainerStarted","Data":"6a6ad5508209ce047cf73ef6b50355c4bb3cbbab034952d341b7d096d51f510f"}
Jan 31 15:03:48 crc kubenswrapper[4751]: I0131 15:03:48.477155 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-zzxfv" podStartSLOduration=2.477138016 podStartE2EDuration="2.477138016s" podCreationTimestamp="2026-01-31 15:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:48.472112284 +0000 UTC m=+1332.846825189" watchObservedRunningTime="2026-01-31 15:03:48.477138016 +0000 UTC m=+1332.851850901"
Jan 31 15:03:51 crc kubenswrapper[4751]: I0131 15:03:51.488725 4751 generic.go:334] "Generic (PLEG): container finished" podID="e91c4dc3-9319-4e4b-951a-4e1f117c3215" containerID="15a7c13661f7a9dc9cca48ea38cbda46b049856ab09d05ef63e1d7c0a14b8bb5" exitCode=0
Jan 31 15:03:51 crc kubenswrapper[4751]: I0131 15:03:51.489040 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zzxfv" event={"ID":"e91c4dc3-9319-4e4b-951a-4e1f117c3215","Type":"ContainerDied","Data":"15a7c13661f7a9dc9cca48ea38cbda46b049856ab09d05ef63e1d7c0a14b8bb5"}
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.814273 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.914208 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-db-sync-config-data\") pod \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") "
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.914360 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6f8n\" (UniqueName: \"kubernetes.io/projected/e91c4dc3-9319-4e4b-951a-4e1f117c3215-kube-api-access-n6f8n\") pod \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") "
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.914407 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-config-data\") pod \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\" (UID: \"e91c4dc3-9319-4e4b-951a-4e1f117c3215\") "
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.920212 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e91c4dc3-9319-4e4b-951a-4e1f117c3215" (UID: "e91c4dc3-9319-4e4b-951a-4e1f117c3215"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.920262 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91c4dc3-9319-4e4b-951a-4e1f117c3215-kube-api-access-n6f8n" (OuterVolumeSpecName: "kube-api-access-n6f8n") pod "e91c4dc3-9319-4e4b-951a-4e1f117c3215" (UID: "e91c4dc3-9319-4e4b-951a-4e1f117c3215"). InnerVolumeSpecName "kube-api-access-n6f8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:03:52 crc kubenswrapper[4751]: I0131 15:03:52.959981 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-config-data" (OuterVolumeSpecName: "config-data") pod "e91c4dc3-9319-4e4b-951a-4e1f117c3215" (UID: "e91c4dc3-9319-4e4b-951a-4e1f117c3215"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.016151 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6f8n\" (UniqueName: \"kubernetes.io/projected/e91c4dc3-9319-4e4b-951a-4e1f117c3215-kube-api-access-n6f8n\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.016188 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.016201 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e91c4dc3-9319-4e4b-951a-4e1f117c3215-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.513444 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-zzxfv" event={"ID":"e91c4dc3-9319-4e4b-951a-4e1f117c3215","Type":"ContainerDied","Data":"6a6ad5508209ce047cf73ef6b50355c4bb3cbbab034952d341b7d096d51f510f"}
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.513927 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6ad5508209ce047cf73ef6b50355c4bb3cbbab034952d341b7d096d51f510f"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.513495 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-zzxfv"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.922380 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:03:53 crc kubenswrapper[4751]: E0131 15:03:53.922631 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91c4dc3-9319-4e4b-951a-4e1f117c3215" containerName="glance-db-sync"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.922642 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91c4dc3-9319-4e4b-951a-4e1f117c3215" containerName="glance-db-sync"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.922805 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91c4dc3-9319-4e4b-951a-4e1f117c3215" containerName="glance-db-sync"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.923523 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.925487 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-lgkvw"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.925820 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.925899 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data"
Jan 31 15:03:53 crc kubenswrapper[4751]: I0131 15:03:53.935363 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.032985 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033030 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p949\" (UniqueName: \"kubernetes.io/projected/0afd722e-d093-428e-9f16-85a889d08de1-kube-api-access-9p949\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033055 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-sys\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033098 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-scripts\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033126 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-config-data\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033189 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-httpd-run\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033208 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-nvme\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033375 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033422 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-lib-modules\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033490 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-logs\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033506 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033538 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-run\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033558 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-dev\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.033578 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.134978 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135024 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p949\" (UniqueName: \"kubernetes.io/projected/0afd722e-d093-428e-9f16-85a889d08de1-kube-api-access-9p949\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135052 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-sys\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135100 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-scripts\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135126 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-config-data\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135188 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-httpd-run\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135212 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-nvme\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135230 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-sys\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135268 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135336 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-lib-modules\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135432 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-logs\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135459 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135455 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-nvme\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135494 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-lib-modules\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135505 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-run\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135529 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-run\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135537 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-dev\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135201 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135563 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-dev\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135574 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135628 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135639 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.135709 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.136032 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-httpd-run\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.136201 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-logs\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.141193 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-scripts\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.141671 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-config-data\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.158693 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.172786 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName:
\"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.177748 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p949\" (UniqueName: \"kubernetes.io/projected/0afd722e-d093-428e-9f16-85a889d08de1-kube-api-access-9p949\") pod \"glance-default-single-0\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.239773 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.652726 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:54 crc kubenswrapper[4751]: I0131 15:03:54.720163 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.537105 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0afd722e-d093-428e-9f16-85a889d08de1","Type":"ContainerStarted","Data":"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee"} Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.537657 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0afd722e-d093-428e-9f16-85a889d08de1","Type":"ContainerStarted","Data":"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28"} Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.537672 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"0afd722e-d093-428e-9f16-85a889d08de1","Type":"ContainerStarted","Data":"ede3ffddd3e2cca5c5d06668cb40b6dbe9a582dbacbe5e907af617419d72cb7b"} Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.537309 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-httpd" containerID="cri-o://ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee" gracePeriod=30 Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.537242 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-log" containerID="cri-o://982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28" gracePeriod=30 Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.569416 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.569394978 podStartE2EDuration="2.569394978s" podCreationTimestamp="2026-01-31 15:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:55.56605271 +0000 UTC m=+1339.940765605" watchObservedRunningTime="2026-01-31 15:03:55.569394978 +0000 UTC m=+1339.944107883" Jan 31 15:03:55 crc kubenswrapper[4751]: I0131 15:03:55.931733 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062351 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-var-locks-brick\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062666 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-run\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062720 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p949\" (UniqueName: \"kubernetes.io/projected/0afd722e-d093-428e-9f16-85a889d08de1-kube-api-access-9p949\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062455 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062751 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-config-data\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062771 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-scripts\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062775 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-run" (OuterVolumeSpecName: "run") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062839 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-dev\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062878 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062898 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-lib-modules\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062930 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-logs\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062944 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-sys\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062959 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-iscsi\") pod 
\"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.062975 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-httpd-run\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063022 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-nvme\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063036 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"0afd722e-d093-428e-9f16-85a889d08de1\" (UID: \"0afd722e-d093-428e-9f16-85a889d08de1\") " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063366 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063389 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063428 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063474 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063535 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-sys" (OuterVolumeSpecName: "sys") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063579 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063585 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-dev" (OuterVolumeSpecName: "dev") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.063684 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.064011 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-logs" (OuterVolumeSpecName: "logs") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.068248 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.068878 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.069975 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afd722e-d093-428e-9f16-85a889d08de1-kube-api-access-9p949" (OuterVolumeSpecName: "kube-api-access-9p949") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "kube-api-access-9p949". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.070358 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-scripts" (OuterVolumeSpecName: "scripts") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.109438 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-config-data" (OuterVolumeSpecName: "config-data") pod "0afd722e-d093-428e-9f16-85a889d08de1" (UID: "0afd722e-d093-428e-9f16-85a889d08de1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165034 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165167 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0afd722e-d093-428e-9f16-85a889d08de1-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165234 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165314 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165386 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165479 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165538 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165591 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165648 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0afd722e-d093-428e-9f16-85a889d08de1-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165702 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0afd722e-d093-428e-9f16-85a889d08de1-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165769 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.165831 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p949\" (UniqueName: \"kubernetes.io/projected/0afd722e-d093-428e-9f16-85a889d08de1-kube-api-access-9p949\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.178624 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.180587 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.268458 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.268494 4751 reconciler_common.go:293] "Volume detached for volume 
\"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545443 4751 generic.go:334] "Generic (PLEG): container finished" podID="0afd722e-d093-428e-9f16-85a889d08de1" containerID="ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee" exitCode=143 Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545470 4751 generic.go:334] "Generic (PLEG): container finished" podID="0afd722e-d093-428e-9f16-85a889d08de1" containerID="982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28" exitCode=143 Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545491 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0afd722e-d093-428e-9f16-85a889d08de1","Type":"ContainerDied","Data":"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee"} Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545515 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545536 4751 scope.go:117] "RemoveContainer" containerID="ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545516 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0afd722e-d093-428e-9f16-85a889d08de1","Type":"ContainerDied","Data":"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28"} Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.545677 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"0afd722e-d093-428e-9f16-85a889d08de1","Type":"ContainerDied","Data":"ede3ffddd3e2cca5c5d06668cb40b6dbe9a582dbacbe5e907af617419d72cb7b"} Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.567348 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.575619 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.588684 4751 scope.go:117] "RemoveContainer" containerID="982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.597039 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:56 crc kubenswrapper[4751]: E0131 15:03:56.597376 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-httpd" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.597394 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-httpd" Jan 31 15:03:56 crc kubenswrapper[4751]: 
E0131 15:03:56.597411 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-log" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.597416 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-log" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.597554 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-httpd" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.597570 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afd722e-d093-428e-9f16-85a889d08de1" containerName="glance-log" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.598361 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.604235 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.604336 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-lgkvw" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.604620 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.616582 4751 scope.go:117] "RemoveContainer" containerID="ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.620494 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:56 crc kubenswrapper[4751]: E0131 15:03:56.628742 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee\": container with ID starting with ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee not found: ID does not exist" containerID="ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.628789 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee"} err="failed to get container status \"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee\": rpc error: code = NotFound desc = could not find container \"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee\": container with ID starting with ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee not found: ID does not exist" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.628814 4751 scope.go:117] "RemoveContainer" containerID="982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28" Jan 31 15:03:56 crc kubenswrapper[4751]: E0131 15:03:56.629195 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28\": container with ID starting with 982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28 not found: ID does not exist" containerID="982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.629232 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28"} err="failed to get container status \"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28\": rpc error: code = NotFound desc = could not find container \"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28\": container 
with ID starting with 982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28 not found: ID does not exist" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.629247 4751 scope.go:117] "RemoveContainer" containerID="ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.629639 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee"} err="failed to get container status \"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee\": rpc error: code = NotFound desc = could not find container \"ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee\": container with ID starting with ad052eff92c8d2861fc114164e87006f55dc34b295c394c1d33b1cf962ac16ee not found: ID does not exist" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.629679 4751 scope.go:117] "RemoveContainer" containerID="982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.630814 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28"} err="failed to get container status \"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28\": rpc error: code = NotFound desc = could not find container \"982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28\": container with ID starting with 982d4d2fcf6c2bcc38cabd751f3c1dc69dd5d9aa18ff83f7b91bae8effb85e28 not found: ID does not exist" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673642 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-scripts\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") 
" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673686 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673708 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673743 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673763 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673786 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-config-data\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " 
pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673801 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673818 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-dev\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673856 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-sys\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673876 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-lib-modules\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673893 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-httpd-run\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " 
pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673917 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-run\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.673969 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-logs\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.674093 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hwqz\" (UniqueName: \"kubernetes.io/projected/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-kube-api-access-7hwqz\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.775965 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-sys\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776367 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-lib-modules\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 
15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776390 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-httpd-run\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776429 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-run\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776459 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-logs\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776494 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hwqz\" (UniqueName: \"kubernetes.io/projected/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-kube-api-access-7hwqz\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776516 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-scripts\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776579 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776601 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776652 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776678 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776713 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-config-data\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776737 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.776769 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-dev\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.777065 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-dev\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.777134 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-sys\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.777162 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-lib-modules\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.777624 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-httpd-run\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " 
pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.777665 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-run\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.777924 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-logs\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.778286 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.778374 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.778406 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 
15:03:56.778799 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.778927 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.786412 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-config-data\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.797238 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.798318 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-scripts\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.798397 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hwqz\" (UniqueName: 
\"kubernetes.io/projected/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-kube-api-access-7hwqz\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.798696 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-single-0\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:56 crc kubenswrapper[4751]: I0131 15:03:56.917740 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:03:57 crc kubenswrapper[4751]: I0131 15:03:57.333102 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 15:03:57 crc kubenswrapper[4751]: I0131 15:03:57.556835 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d","Type":"ContainerStarted","Data":"9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c"} Jan 31 15:03:57 crc kubenswrapper[4751]: I0131 15:03:57.556873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d","Type":"ContainerStarted","Data":"3752a1af33745a097b553c49a65db992ed225f5133c4335738998e6644147a8d"} Jan 31 15:03:58 crc kubenswrapper[4751]: I0131 15:03:58.429676 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afd722e-d093-428e-9f16-85a889d08de1" path="/var/lib/kubelet/pods/0afd722e-d093-428e-9f16-85a889d08de1/volumes" Jan 31 15:03:58 crc kubenswrapper[4751]: I0131 15:03:58.569111 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d","Type":"ContainerStarted","Data":"028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17"} Jan 31 15:03:58 crc kubenswrapper[4751]: I0131 15:03:58.598257 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.598236099 podStartE2EDuration="2.598236099s" podCreationTimestamp="2026-01-31 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:03:58.589893369 +0000 UTC m=+1342.964606294" watchObservedRunningTime="2026-01-31 15:03:58.598236099 +0000 UTC m=+1342.972948994" Jan 31 15:04:06 crc kubenswrapper[4751]: I0131 15:04:06.918439 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:06 crc kubenswrapper[4751]: I0131 15:04:06.918865 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:06 crc kubenswrapper[4751]: I0131 15:04:06.950555 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:06 crc kubenswrapper[4751]: I0131 15:04:06.974715 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:07 crc kubenswrapper[4751]: I0131 15:04:07.643058 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:07 crc kubenswrapper[4751]: I0131 15:04:07.643132 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:09 crc kubenswrapper[4751]: I0131 15:04:09.713011 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:09 crc kubenswrapper[4751]: I0131 15:04:09.713864 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 15:04:09 crc kubenswrapper[4751]: I0131 15:04:09.717270 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.423309 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.427434 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.462666 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.470042 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.491601 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.503202 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515146 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42nd\" (UniqueName: \"kubernetes.io/projected/34a7875b-0d63-43d7-9833-07b4ddc85ff6-kube-api-access-q42nd\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515194 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-httpd-run\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515287 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515329 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 
31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515347 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-config-data\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515374 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-run\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515396 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-logs\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515422 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515448 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-nvme\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc 
kubenswrapper[4751]: I0131 15:04:12.515471 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-scripts\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515487 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-sys\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515502 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-lib-modules\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515520 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-dev\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.515534 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 
15:04:12.616804 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616851 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616879 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616910 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-httpd-run\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616928 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-sys\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616946 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616965 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-config-data\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.616991 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617007 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-run\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617041 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-run\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617108 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-nvme\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617193 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-dev\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617224 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-logs\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617302 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617365 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617396 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: 
\"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617416 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617609 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-nvme\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617644 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-lib-modules\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617664 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617680 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-scripts\") pod \"glance-default-single-2\" (UID: 
\"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617698 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqz7z\" (UniqueName: \"kubernetes.io/projected/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-kube-api-access-pqz7z\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617709 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-logs\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617717 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-sys\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617745 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-sys\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617766 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-lib-modules\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc 
kubenswrapper[4751]: I0131 15:04:12.617869 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-nvme\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617745 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-lib-modules\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617910 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-dev\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617926 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617941 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-config-data\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617962 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-scripts\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.617982 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-logs\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.618000 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-run\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.618016 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q42nd\" (UniqueName: \"kubernetes.io/projected/34a7875b-0d63-43d7-9833-07b4ddc85ff6-kube-api-access-q42nd\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.618033 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-httpd-run\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.618200 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-dev\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.618233 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.618296 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-httpd-run\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.625402 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-config-data\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.626222 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-scripts\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.634436 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42nd\" (UniqueName: \"kubernetes.io/projected/34a7875b-0d63-43d7-9833-07b4ddc85ff6-kube-api-access-q42nd\") pod \"glance-default-single-2\" (UID: 
\"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.637852 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.648225 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-2\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719344 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719388 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-nvme\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719418 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-dev\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719465 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-lib-modules\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719489 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719510 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqz7z\" (UniqueName: \"kubernetes.io/projected/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-kube-api-access-pqz7z\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719545 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-config-data\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719572 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-scripts\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719579 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-lib-modules\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719611 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-logs\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719637 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-run\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719665 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-dev\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719675 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719706 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-var-locks-brick\") pod \"glance-default-single-1\" (UID: 
\"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719730 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719748 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-httpd-run\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719770 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-sys\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719855 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-sys\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.720195 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc 
kubenswrapper[4751]: I0131 15:04:12.720278 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-run\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719641 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-nvme\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.719542 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.720305 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.720648 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-logs\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.720837 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-httpd-run\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.723246 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-scripts\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.724316 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-config-data\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.735798 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqz7z\" (UniqueName: \"kubernetes.io/projected/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-kube-api-access-pqz7z\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.739418 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-single-1\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.740057 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-single-1\" (UID: 
\"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.780651 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:12 crc kubenswrapper[4751]: I0131 15:04:12.804027 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.204927 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.257827 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:04:13 crc kubenswrapper[4751]: W0131 15:04:13.330807 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dfb5a9a_fa92_40c6_84ab_7b6b081cc688.slice/crio-402147d966fca3f55a794a137f4a6b76809d5828d0e869e767465f78800c790f WatchSource:0}: Error finding container 402147d966fca3f55a794a137f4a6b76809d5828d0e869e767465f78800c790f: Status 404 returned error can't find the container with id 402147d966fca3f55a794a137f4a6b76809d5828d0e869e767465f78800c790f Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.692711 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688","Type":"ContainerStarted","Data":"4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b"} Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.693230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688","Type":"ContainerStarted","Data":"645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f"} Jan 31 15:04:13 
crc kubenswrapper[4751]: I0131 15:04:13.693249 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688","Type":"ContainerStarted","Data":"402147d966fca3f55a794a137f4a6b76809d5828d0e869e767465f78800c790f"} Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.694935 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"34a7875b-0d63-43d7-9833-07b4ddc85ff6","Type":"ContainerStarted","Data":"b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e"} Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.694966 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"34a7875b-0d63-43d7-9833-07b4ddc85ff6","Type":"ContainerStarted","Data":"619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b"} Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.694980 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"34a7875b-0d63-43d7-9833-07b4ddc85ff6","Type":"ContainerStarted","Data":"072c566469ce5399f0b0c6f71e56fe648451b73be39d67d13b4a1e8f7f64cd97"} Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.719609 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.719584883 podStartE2EDuration="2.719584883s" podCreationTimestamp="2026-01-31 15:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:13.712833705 +0000 UTC m=+1358.087546580" watchObservedRunningTime="2026-01-31 15:04:13.719584883 +0000 UTC m=+1358.094297778" Jan 31 15:04:13 crc kubenswrapper[4751]: I0131 15:04:13.739600 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=2.739577731 podStartE2EDuration="2.739577731s" podCreationTimestamp="2026-01-31 15:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:13.733042728 +0000 UTC m=+1358.107755633" watchObservedRunningTime="2026-01-31 15:04:13.739577731 +0000 UTC m=+1358.114290656" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.781603 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.784319 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.804912 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.805117 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.810129 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.830727 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.841934 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:22 crc kubenswrapper[4751]: I0131 15:04:22.842257 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:23 crc kubenswrapper[4751]: I0131 15:04:23.793260 
4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:23 crc kubenswrapper[4751]: I0131 15:04:23.793331 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:23 crc kubenswrapper[4751]: I0131 15:04:23.793354 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:23 crc kubenswrapper[4751]: I0131 15:04:23.793374 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:25 crc kubenswrapper[4751]: I0131 15:04:25.737765 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:25 crc kubenswrapper[4751]: I0131 15:04:25.812040 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:25 crc kubenswrapper[4751]: I0131 15:04:25.881242 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:25 crc kubenswrapper[4751]: I0131 15:04:25.881390 4751 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 15:04:25 crc kubenswrapper[4751]: I0131 15:04:25.895848 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:26 crc kubenswrapper[4751]: I0131 15:04:26.892737 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 15:04:26 crc kubenswrapper[4751]: I0131 15:04:26.908040 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:04:27 crc kubenswrapper[4751]: I0131 15:04:27.828745 4751 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-log" containerID="cri-o://645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f" gracePeriod=30 Jan 31 15:04:27 crc kubenswrapper[4751]: I0131 15:04:27.828996 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-httpd" containerID="cri-o://b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e" gracePeriod=30 Jan 31 15:04:27 crc kubenswrapper[4751]: I0131 15:04:27.828900 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-httpd" containerID="cri-o://4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b" gracePeriod=30 Jan 31 15:04:27 crc kubenswrapper[4751]: I0131 15:04:27.828962 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-log" containerID="cri-o://619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b" gracePeriod=30 Jan 31 15:04:27 crc kubenswrapper[4751]: I0131 15:04:27.835932 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-1" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.140:9292/healthcheck\": EOF" Jan 31 15:04:28 crc kubenswrapper[4751]: I0131 15:04:28.839839 4751 generic.go:334] "Generic (PLEG): container finished" podID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerID="645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f" exitCode=143 Jan 31 15:04:28 crc kubenswrapper[4751]: I0131 15:04:28.839939 4751 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688","Type":"ContainerDied","Data":"645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f"} Jan 31 15:04:28 crc kubenswrapper[4751]: I0131 15:04:28.843897 4751 generic.go:334] "Generic (PLEG): container finished" podID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerID="619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b" exitCode=143 Jan 31 15:04:28 crc kubenswrapper[4751]: I0131 15:04:28.843946 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"34a7875b-0d63-43d7-9833-07b4ddc85ff6","Type":"ContainerDied","Data":"619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b"} Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.389325 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462338 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-lib-modules\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462403 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-config-data\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462434 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: 
\"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462454 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-logs\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462500 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-run\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462519 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-scripts\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462565 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q42nd\" (UniqueName: \"kubernetes.io/projected/34a7875b-0d63-43d7-9833-07b4ddc85ff6-kube-api-access-q42nd\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462597 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-nvme\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462615 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462658 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-dev\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462890 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-httpd-run\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462932 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-var-locks-brick\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462955 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-sys\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462977 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-iscsi\") pod \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\" (UID: \"34a7875b-0d63-43d7-9833-07b4ddc85ff6\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.462982 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463122 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463217 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463285 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-run" (OuterVolumeSpecName: "run") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463965 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.467270 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.467284 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463399 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-logs" (OuterVolumeSpecName: "logs") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463635 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463654 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-dev" (OuterVolumeSpecName: "dev") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). 
InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463675 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.463693 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-sys" (OuterVolumeSpecName: "sys") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.468464 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.470694 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a7875b-0d63-43d7-9833-07b4ddc85ff6-kube-api-access-q42nd" (OuterVolumeSpecName: "kube-api-access-q42nd") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "kube-api-access-q42nd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.471398 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance-cache") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.474836 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-scripts" (OuterVolumeSpecName: "scripts") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.508452 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.511646 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-config-data" (OuterVolumeSpecName: "config-data") pod "34a7875b-0d63-43d7-9833-07b4ddc85ff6" (UID: "34a7875b-0d63-43d7-9833-07b4ddc85ff6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568085 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568140 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-run\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568181 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-var-locks-brick\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568214 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568246 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-config-data\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568263 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-iscsi\") pod 
\"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568315 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-dev\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568352 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-httpd-run\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568375 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-logs\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568423 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqz7z\" (UniqueName: \"kubernetes.io/projected/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-kube-api-access-pqz7z\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568451 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-nvme\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568469 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-sys\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568499 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-lib-modules\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568526 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-scripts\") pod \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\" (UID: \"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688\") " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568825 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568847 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568856 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568865 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568873 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q42nd\" 
(UniqueName: \"kubernetes.io/projected/34a7875b-0d63-43d7-9833-07b4ddc85ff6-kube-api-access-q42nd\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568889 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568899 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568906 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34a7875b-0d63-43d7-9833-07b4ddc85ff6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568914 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568923 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34a7875b-0d63-43d7-9833-07b4ddc85ff6-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.568931 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34a7875b-0d63-43d7-9833-07b4ddc85ff6-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.571493 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance-cache") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). 
InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.571561 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-run" (OuterVolumeSpecName: "run") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.571590 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.571618 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.571643 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.571667 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-sys" (OuterVolumeSpecName: "sys") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.572042 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.572907 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-dev" (OuterVolumeSpecName: "dev") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.573236 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.573529 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.573583 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-logs" (OuterVolumeSpecName: "logs") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.574015 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-kube-api-access-pqz7z" (OuterVolumeSpecName: "kube-api-access-pqz7z") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "kube-api-access-pqz7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.575038 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-scripts" (OuterVolumeSpecName: "scripts") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.583943 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.587186 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.603739 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-config-data" (OuterVolumeSpecName: "config-data") pod "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" (UID: "2dfb5a9a-fa92-40c6-84ab-7b6b081cc688"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670118 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670150 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670184 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670192 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc 
kubenswrapper[4751]: I0131 15:04:31.670201 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670213 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670221 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670231 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670239 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqz7z\" (UniqueName: \"kubernetes.io/projected/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-kube-api-access-pqz7z\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670248 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670256 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670264 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node 
\"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670271 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670279 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670300 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.670308 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.682776 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.683329 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.772391 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.772754 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 31 
15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.887861 4751 generic.go:334] "Generic (PLEG): container finished" podID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerID="4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b" exitCode=0 Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.887920 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688","Type":"ContainerDied","Data":"4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b"} Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.887944 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"2dfb5a9a-fa92-40c6-84ab-7b6b081cc688","Type":"ContainerDied","Data":"402147d966fca3f55a794a137f4a6b76809d5828d0e869e767465f78800c790f"} Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.887960 4751 scope.go:117] "RemoveContainer" containerID="4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.888050 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.891838 4751 generic.go:334] "Generic (PLEG): container finished" podID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerID="b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e" exitCode=0 Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.891896 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"34a7875b-0d63-43d7-9833-07b4ddc85ff6","Type":"ContainerDied","Data":"b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e"} Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.891947 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"34a7875b-0d63-43d7-9833-07b4ddc85ff6","Type":"ContainerDied","Data":"072c566469ce5399f0b0c6f71e56fe648451b73be39d67d13b4a1e8f7f64cd97"} Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.891879 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.912241 4751 scope.go:117] "RemoveContainer" containerID="645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.929704 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.939881 4751 scope.go:117] "RemoveContainer" containerID="4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b" Jan 31 15:04:31 crc kubenswrapper[4751]: E0131 15:04:31.940397 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b\": container with ID starting with 4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b not found: ID does not exist" containerID="4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.940435 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b"} err="failed to get container status \"4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b\": rpc error: code = NotFound desc = could not find container \"4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b\": container with ID starting with 4fbc4ee6ad6c761a5b5a51163a560ee167d3f98243bc794a7addb24ce85b940b not found: ID does not exist" Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.940462 4751 scope.go:117] "RemoveContainer" containerID="645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f" Jan 31 15:04:31 crc kubenswrapper[4751]: E0131 15:04:31.940899 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f\": container with ID starting with 645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f not found: ID does not exist" containerID="645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.940943 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f"} err="failed to get container status \"645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f\": rpc error: code = NotFound desc = could not find container \"645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f\": container with ID starting with 645226e20597fef9cf7c812cae862c4ac3ab03116b7fb8b29ff6d081dd574d5f not found: ID does not exist"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.940978 4751 scope.go:117] "RemoveContainer" containerID="b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.946272 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.956525 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.962299 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.965109 4751 scope.go:117] "RemoveContainer" containerID="619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.982878 4751 scope.go:117] "RemoveContainer" containerID="b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e"
Jan 31 15:04:31 crc kubenswrapper[4751]: E0131 15:04:31.983400 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e\": container with ID starting with b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e not found: ID does not exist" containerID="b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.983441 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e"} err="failed to get container status \"b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e\": rpc error: code = NotFound desc = could not find container \"b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e\": container with ID starting with b5eadfa9c57b278838c0ae7db714f9a98b67d467f6d66b571b2645118f81fc0e not found: ID does not exist"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.983468 4751 scope.go:117] "RemoveContainer" containerID="619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b"
Jan 31 15:04:31 crc kubenswrapper[4751]: E0131 15:04:31.983927 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b\": container with ID starting with 619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b not found: ID does not exist" containerID="619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b"
Jan 31 15:04:31 crc kubenswrapper[4751]: I0131 15:04:31.983966 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b"} err="failed to get container status \"619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b\": rpc error: code = NotFound desc = could not find container \"619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b\": container with ID starting with 619f96ea7220c280d8a22355d64a0c2812492ed032b5d0869b1e6b943637206b not found: ID does not exist"
Jan 31 15:04:32 crc kubenswrapper[4751]: I0131 15:04:32.423385 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" path="/var/lib/kubelet/pods/2dfb5a9a-fa92-40c6-84ab-7b6b081cc688/volumes"
Jan 31 15:04:32 crc kubenswrapper[4751]: I0131 15:04:32.424758 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" path="/var/lib/kubelet/pods/34a7875b-0d63-43d7-9833-07b4ddc85ff6/volumes"
Jan 31 15:04:33 crc kubenswrapper[4751]: I0131 15:04:33.204827 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:04:33 crc kubenswrapper[4751]: I0131 15:04:33.205364 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-log" containerID="cri-o://9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c" gracePeriod=30
Jan 31 15:04:33 crc kubenswrapper[4751]: I0131 15:04:33.205434 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-httpd" containerID="cri-o://028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17" gracePeriod=30
Jan 31 15:04:33 crc kubenswrapper[4751]: I0131 15:04:33.924150 4751 generic.go:334] "Generic (PLEG): container finished" podID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerID="9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c" exitCode=143
Jan 31 15:04:33 crc kubenswrapper[4751]: I0131 15:04:33.924228 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d","Type":"ContainerDied","Data":"9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c"}
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.741541 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.852528 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853020 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-logs\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853160 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-sys\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853280 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-run\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853415 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-httpd-run\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853746 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853858 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-config-data\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.853976 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-var-locks-brick\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.854196 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-dev\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.854300 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-nvme\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.854408 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-iscsi\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.854535 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-scripts\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.854625 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hwqz\" (UniqueName: \"kubernetes.io/projected/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-kube-api-access-7hwqz\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.854719 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-lib-modules\") pod \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\" (UID: \"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d\") "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.855282 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.855398 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.857382 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.858490 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-sys" (OuterVolumeSpecName: "sys") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.858534 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-run" (OuterVolumeSpecName: "run") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.858821 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-logs" (OuterVolumeSpecName: "logs") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.858858 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.858859 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.858881 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-dev" (OuterVolumeSpecName: "dev") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.868770 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-scripts" (OuterVolumeSpecName: "scripts") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.871242 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.882984 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance-cache") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.885263 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-kube-api-access-7hwqz" (OuterVolumeSpecName: "kube-api-access-7hwqz") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "kube-api-access-7hwqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.963881 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964188 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964198 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964219 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964230 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964238 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964246 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964254 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964262 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964270 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hwqz\" (UniqueName: \"kubernetes.io/projected/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-kube-api-access-7hwqz\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964279 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-lib-modules\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964293 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" "
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.964301 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.977494 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-config-data" (OuterVolumeSpecName: "config-data") pod "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" (UID: "46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.980239 4751 generic.go:334] "Generic (PLEG): container finished" podID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerID="028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17" exitCode=0
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.980277 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d","Type":"ContainerDied","Data":"028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17"}
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.980302 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d","Type":"ContainerDied","Data":"3752a1af33745a097b553c49a65db992ed225f5133c4335738998e6644147a8d"}
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.980317 4751 scope.go:117] "RemoveContainer" containerID="028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17"
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.980447 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.989801 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc"
Jan 31 15:04:36 crc kubenswrapper[4751]: I0131 15:04:36.996617 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.010543 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.016940 4751 scope.go:117] "RemoveContainer" containerID="9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.018956 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.046118 4751 scope.go:117] "RemoveContainer" containerID="028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.046679 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17\": container with ID starting with 028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17 not found: ID does not exist" containerID="028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.047128 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17"} err="failed to get container status \"028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17\": rpc error: code = NotFound desc = could not find container \"028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17\": container with ID starting with 028ae3739718b4f9fa1406588bccd6e53a5c4051e633a6571cc7446d59642b17 not found: ID does not exist"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.047167 4751 scope.go:117] "RemoveContainer" containerID="9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.047405 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c\": container with ID starting with 9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c not found: ID does not exist" containerID="9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.047446 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c"} err="failed to get container status \"9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c\": rpc error: code = NotFound desc = could not find container \"9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c\": container with ID starting with 9f3ae92fe0ff829368293b9ad1323f714b2c1d0600b09baf58692dca7948e19c not found: ID does not exist"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.066285 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.066313 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.066324 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.075795 4751 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46fa6f5a_9f30_4e19_8b53_b8aa7c3a533d.slice\": RecentStats: unable to find data in memory cache]"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.569381 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zzxfv"]
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.576134 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-zzxfv"]
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.592997 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance633a-account-delete-4gbtt"]
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.593268 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593283 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.593297 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593303 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.593319 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593325 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.593336 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593343 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.593357 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593362 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: E0131 15:04:37.593374 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593379 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593500 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593514 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a7875b-0d63-43d7-9833-07b4ddc85ff6" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593522 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593530 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593537 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-log"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593548 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfb5a9a-fa92-40c6-84ab-7b6b081cc688" containerName="glance-httpd"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.593962 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.601744 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance633a-account-delete-4gbtt"]
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.676206 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-operator-scripts\") pod \"glance633a-account-delete-4gbtt\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") " pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.676546 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m2ns\" (UniqueName: \"kubernetes.io/projected/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-kube-api-access-7m2ns\") pod \"glance633a-account-delete-4gbtt\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") " pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.777784 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-operator-scripts\") pod \"glance633a-account-delete-4gbtt\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") " pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.777864 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m2ns\" (UniqueName: \"kubernetes.io/projected/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-kube-api-access-7m2ns\") pod \"glance633a-account-delete-4gbtt\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") " pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.778577 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-operator-scripts\") pod \"glance633a-account-delete-4gbtt\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") " pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.798833 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m2ns\" (UniqueName: \"kubernetes.io/projected/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-kube-api-access-7m2ns\") pod \"glance633a-account-delete-4gbtt\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") " pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:37 crc kubenswrapper[4751]: I0131 15:04:37.910372 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:38 crc kubenswrapper[4751]: I0131 15:04:38.342222 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance633a-account-delete-4gbtt"]
Jan 31 15:04:38 crc kubenswrapper[4751]: I0131 15:04:38.414813 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d" path="/var/lib/kubelet/pods/46fa6f5a-9f30-4e19-8b53-b8aa7c3a533d/volumes"
Jan 31 15:04:38 crc kubenswrapper[4751]: I0131 15:04:38.415551 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91c4dc3-9319-4e4b-951a-4e1f117c3215" path="/var/lib/kubelet/pods/e91c4dc3-9319-4e4b-951a-4e1f117c3215/volumes"
Jan 31 15:04:38 crc kubenswrapper[4751]: I0131 15:04:38.997902 4751 generic.go:334] "Generic (PLEG): container finished" podID="1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" containerID="c486e82ff06dabf3bbaf584cc05f4bf167ea45034bb1b4f577adb93e884d0e62" exitCode=0
Jan 31 15:04:38 crc kubenswrapper[4751]: I0131 15:04:38.998009 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt" event={"ID":"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a","Type":"ContainerDied","Data":"c486e82ff06dabf3bbaf584cc05f4bf167ea45034bb1b4f577adb93e884d0e62"}
Jan 31 15:04:38 crc kubenswrapper[4751]: I0131 15:04:38.998223 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt" event={"ID":"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a","Type":"ContainerStarted","Data":"1a1bc68638ca8cbc0b488a80037c2b24dabf4fc76297519708441cd3c84fc4ca"}
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.310475 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt"
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.412749 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-operator-scripts\") pod \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") "
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.412924 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m2ns\" (UniqueName: \"kubernetes.io/projected/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-kube-api-access-7m2ns\") pod \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\" (UID: \"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a\") "
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.413796 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" (UID: "1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.423256 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-kube-api-access-7m2ns" (OuterVolumeSpecName: "kube-api-access-7m2ns") pod "1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" (UID: "1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a"). InnerVolumeSpecName "kube-api-access-7m2ns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.515238 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.515295 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m2ns\" (UniqueName: \"kubernetes.io/projected/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a-kube-api-access-7m2ns\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.848471 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 15:04:40 crc kubenswrapper[4751]: E0131 15:04:40.848809 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" containerName="mariadb-account-delete"
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.848824 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" containerName="mariadb-account-delete"
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.848936 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" containerName="mariadb-account-delete"
Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.849399 4751 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.851350 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-gh2c4" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.851598 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.852100 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.852384 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.869434 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.922037 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.922110 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qxht\" (UniqueName: \"kubernetes.io/projected/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-kube-api-access-5qxht\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.922167 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:40 crc kubenswrapper[4751]: I0131 15:04:40.922217 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-scripts\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.016244 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt" event={"ID":"1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a","Type":"ContainerDied","Data":"1a1bc68638ca8cbc0b488a80037c2b24dabf4fc76297519708441cd3c84fc4ca"} Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.016307 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a1bc68638ca8cbc0b488a80037c2b24dabf4fc76297519708441cd3c84fc4ca" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.016587 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance633a-account-delete-4gbtt" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.023462 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-scripts\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.023543 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.023575 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qxht\" (UniqueName: \"kubernetes.io/projected/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-kube-api-access-5qxht\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.023621 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.024789 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc 
kubenswrapper[4751]: I0131 15:04:41.024800 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-scripts\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.032334 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.039808 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qxht\" (UniqueName: \"kubernetes.io/projected/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-kube-api-access-5qxht\") pod \"openstackclient\" (UID: \"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb\") " pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.178358 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 15:04:41 crc kubenswrapper[4751]: I0131 15:04:41.598011 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.024800 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb","Type":"ContainerStarted","Data":"c57156abf0bd178296d0ed3e767e5309d5ca4f93240cd0eeca5761f9adeff9af"} Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.025118 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb","Type":"ContainerStarted","Data":"02039631bc193bd76c54ff239695724d1cccb3eda3e0ae5770dc0b2bb3e8c27e"} Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.039594 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.039578649 podStartE2EDuration="2.039578649s" podCreationTimestamp="2026-01-31 15:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:42.037422662 +0000 UTC m=+1386.412135547" watchObservedRunningTime="2026-01-31 15:04:42.039578649 +0000 UTC m=+1386.414291534" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.627879 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-9z5l9"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.636969 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-9z5l9"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.648516 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-633a-account-create-update-kmj9d"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.654021 4751 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance633a-account-delete-4gbtt"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.659306 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-633a-account-create-update-kmj9d"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.670006 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance633a-account-delete-4gbtt"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.814247 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-bbbff"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.815336 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.820113 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-fb75-account-create-update-8nfdw"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.821439 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.823592 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.824428 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-bbbff"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.835148 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-fb75-account-create-update-8nfdw"] Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.879541 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87fj\" (UniqueName: \"kubernetes.io/projected/88adbd16-7694-4f3b-8de1-b15932042491-kube-api-access-p87fj\") pod \"glance-db-create-bbbff\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.879628 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d2dc104-ad94-47b2-add7-9314eb88e5b0-operator-scripts\") pod \"glance-fb75-account-create-update-8nfdw\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.879677 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcngm\" (UniqueName: \"kubernetes.io/projected/2d2dc104-ad94-47b2-add7-9314eb88e5b0-kube-api-access-vcngm\") pod \"glance-fb75-account-create-update-8nfdw\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.879715 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88adbd16-7694-4f3b-8de1-b15932042491-operator-scripts\") pod \"glance-db-create-bbbff\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.980497 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87fj\" (UniqueName: \"kubernetes.io/projected/88adbd16-7694-4f3b-8de1-b15932042491-kube-api-access-p87fj\") pod \"glance-db-create-bbbff\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.980564 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d2dc104-ad94-47b2-add7-9314eb88e5b0-operator-scripts\") pod \"glance-fb75-account-create-update-8nfdw\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.980600 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcngm\" (UniqueName: \"kubernetes.io/projected/2d2dc104-ad94-47b2-add7-9314eb88e5b0-kube-api-access-vcngm\") pod \"glance-fb75-account-create-update-8nfdw\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.980629 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88adbd16-7694-4f3b-8de1-b15932042491-operator-scripts\") pod \"glance-db-create-bbbff\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 
15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.981343 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88adbd16-7694-4f3b-8de1-b15932042491-operator-scripts\") pod \"glance-db-create-bbbff\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.981845 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d2dc104-ad94-47b2-add7-9314eb88e5b0-operator-scripts\") pod \"glance-fb75-account-create-update-8nfdw\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.998116 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87fj\" (UniqueName: \"kubernetes.io/projected/88adbd16-7694-4f3b-8de1-b15932042491-kube-api-access-p87fj\") pod \"glance-db-create-bbbff\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:42 crc kubenswrapper[4751]: I0131 15:04:42.999210 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcngm\" (UniqueName: \"kubernetes.io/projected/2d2dc104-ad94-47b2-add7-9314eb88e5b0-kube-api-access-vcngm\") pod \"glance-fb75-account-create-update-8nfdw\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:43 crc kubenswrapper[4751]: I0131 15:04:43.139889 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:43 crc kubenswrapper[4751]: I0131 15:04:43.157260 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:43 crc kubenswrapper[4751]: I0131 15:04:43.582951 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-bbbff"] Jan 31 15:04:43 crc kubenswrapper[4751]: W0131 15:04:43.590945 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88adbd16_7694_4f3b_8de1_b15932042491.slice/crio-2a073f1ce887077ba61fb8ce7661d5c057ed31d457d77d5994a4916ab666cd3c WatchSource:0}: Error finding container 2a073f1ce887077ba61fb8ce7661d5c057ed31d457d77d5994a4916ab666cd3c: Status 404 returned error can't find the container with id 2a073f1ce887077ba61fb8ce7661d5c057ed31d457d77d5994a4916ab666cd3c Jan 31 15:04:43 crc kubenswrapper[4751]: I0131 15:04:43.641473 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-fb75-account-create-update-8nfdw"] Jan 31 15:04:43 crc kubenswrapper[4751]: W0131 15:04:43.652156 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2dc104_ad94_47b2_add7_9314eb88e5b0.slice/crio-4567af6c9694bff5a0fadc261de59c39482b429c7e9f3decda303815f32bc220 WatchSource:0}: Error finding container 4567af6c9694bff5a0fadc261de59c39482b429c7e9f3decda303815f32bc220: Status 404 returned error can't find the container with id 4567af6c9694bff5a0fadc261de59c39482b429c7e9f3decda303815f32bc220 Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.047800 4751 generic.go:334] "Generic (PLEG): container finished" podID="2d2dc104-ad94-47b2-add7-9314eb88e5b0" containerID="a87d6cd135483f11e653acd5122adb7f7e32f94e7051f9157fa4ae04850a4813" exitCode=0 Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.048275 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" 
event={"ID":"2d2dc104-ad94-47b2-add7-9314eb88e5b0","Type":"ContainerDied","Data":"a87d6cd135483f11e653acd5122adb7f7e32f94e7051f9157fa4ae04850a4813"} Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.048307 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" event={"ID":"2d2dc104-ad94-47b2-add7-9314eb88e5b0","Type":"ContainerStarted","Data":"4567af6c9694bff5a0fadc261de59c39482b429c7e9f3decda303815f32bc220"} Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.054968 4751 generic.go:334] "Generic (PLEG): container finished" podID="88adbd16-7694-4f3b-8de1-b15932042491" containerID="77d9f01225cc43eac33fe40d8bc014694a35ae20a7b11d1e4c070bd741ce303a" exitCode=0 Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.055008 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-bbbff" event={"ID":"88adbd16-7694-4f3b-8de1-b15932042491","Type":"ContainerDied","Data":"77d9f01225cc43eac33fe40d8bc014694a35ae20a7b11d1e4c070bd741ce303a"} Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.055029 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-bbbff" event={"ID":"88adbd16-7694-4f3b-8de1-b15932042491","Type":"ContainerStarted","Data":"2a073f1ce887077ba61fb8ce7661d5c057ed31d457d77d5994a4916ab666cd3c"} Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.417831 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a" path="/var/lib/kubelet/pods/1e25c4e6-94f1-4729-bcb2-ddb3200ccf3a/volumes" Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.418578 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d66e78e-6853-45e7-966f-cd9ec9586439" path="/var/lib/kubelet/pods/4d66e78e-6853-45e7-966f-cd9ec9586439/volumes" Jan 31 15:04:44 crc kubenswrapper[4751]: I0131 15:04:44.419047 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8028c623-f182-4e00-9c6d-c864a023abb5" path="/var/lib/kubelet/pods/8028c623-f182-4e00-9c6d-c864a023abb5/volumes" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.406424 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.413131 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bbbff" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.532908 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcngm\" (UniqueName: \"kubernetes.io/projected/2d2dc104-ad94-47b2-add7-9314eb88e5b0-kube-api-access-vcngm\") pod \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.533015 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88adbd16-7694-4f3b-8de1-b15932042491-operator-scripts\") pod \"88adbd16-7694-4f3b-8de1-b15932042491\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.533091 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d2dc104-ad94-47b2-add7-9314eb88e5b0-operator-scripts\") pod \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\" (UID: \"2d2dc104-ad94-47b2-add7-9314eb88e5b0\") " Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.533168 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p87fj\" (UniqueName: \"kubernetes.io/projected/88adbd16-7694-4f3b-8de1-b15932042491-kube-api-access-p87fj\") pod \"88adbd16-7694-4f3b-8de1-b15932042491\" (UID: \"88adbd16-7694-4f3b-8de1-b15932042491\") " Jan 31 15:04:45 crc 
kubenswrapper[4751]: I0131 15:04:45.539332 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88adbd16-7694-4f3b-8de1-b15932042491-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88adbd16-7694-4f3b-8de1-b15932042491" (UID: "88adbd16-7694-4f3b-8de1-b15932042491"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.539738 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d2dc104-ad94-47b2-add7-9314eb88e5b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d2dc104-ad94-47b2-add7-9314eb88e5b0" (UID: "2d2dc104-ad94-47b2-add7-9314eb88e5b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.549269 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88adbd16-7694-4f3b-8de1-b15932042491-kube-api-access-p87fj" (OuterVolumeSpecName: "kube-api-access-p87fj") pod "88adbd16-7694-4f3b-8de1-b15932042491" (UID: "88adbd16-7694-4f3b-8de1-b15932042491"). InnerVolumeSpecName "kube-api-access-p87fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.551377 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2dc104-ad94-47b2-add7-9314eb88e5b0-kube-api-access-vcngm" (OuterVolumeSpecName: "kube-api-access-vcngm") pod "2d2dc104-ad94-47b2-add7-9314eb88e5b0" (UID: "2d2dc104-ad94-47b2-add7-9314eb88e5b0"). InnerVolumeSpecName "kube-api-access-vcngm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.635274 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p87fj\" (UniqueName: \"kubernetes.io/projected/88adbd16-7694-4f3b-8de1-b15932042491-kube-api-access-p87fj\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.635329 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcngm\" (UniqueName: \"kubernetes.io/projected/2d2dc104-ad94-47b2-add7-9314eb88e5b0-kube-api-access-vcngm\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.635343 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88adbd16-7694-4f3b-8de1-b15932042491-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:45 crc kubenswrapper[4751]: I0131 15:04:45.635354 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d2dc104-ad94-47b2-add7-9314eb88e5b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:46 crc kubenswrapper[4751]: I0131 15:04:46.075035 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" Jan 31 15:04:46 crc kubenswrapper[4751]: I0131 15:04:46.075049 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-fb75-account-create-update-8nfdw" event={"ID":"2d2dc104-ad94-47b2-add7-9314eb88e5b0","Type":"ContainerDied","Data":"4567af6c9694bff5a0fadc261de59c39482b429c7e9f3decda303815f32bc220"} Jan 31 15:04:46 crc kubenswrapper[4751]: I0131 15:04:46.075121 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4567af6c9694bff5a0fadc261de59c39482b429c7e9f3decda303815f32bc220" Jan 31 15:04:46 crc kubenswrapper[4751]: I0131 15:04:46.076949 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-bbbff" event={"ID":"88adbd16-7694-4f3b-8de1-b15932042491","Type":"ContainerDied","Data":"2a073f1ce887077ba61fb8ce7661d5c057ed31d457d77d5994a4916ab666cd3c"} Jan 31 15:04:46 crc kubenswrapper[4751]: I0131 15:04:46.076979 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a073f1ce887077ba61fb8ce7661d5c057ed31d457d77d5994a4916ab666cd3c" Jan 31 15:04:46 crc kubenswrapper[4751]: I0131 15:04:46.077104 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bbbff"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.946153 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-96ldw"]
Jan 31 15:04:47 crc kubenswrapper[4751]: E0131 15:04:47.946903 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88adbd16-7694-4f3b-8de1-b15932042491" containerName="mariadb-database-create"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.946916 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="88adbd16-7694-4f3b-8de1-b15932042491" containerName="mariadb-database-create"
Jan 31 15:04:47 crc kubenswrapper[4751]: E0131 15:04:47.946946 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2dc104-ad94-47b2-add7-9314eb88e5b0" containerName="mariadb-account-create-update"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.946953 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2dc104-ad94-47b2-add7-9314eb88e5b0" containerName="mariadb-account-create-update"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.947132 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="88adbd16-7694-4f3b-8de1-b15932042491" containerName="mariadb-database-create"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.947150 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2dc104-ad94-47b2-add7-9314eb88e5b0" containerName="mariadb-account-create-update"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.947722 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.949853 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.949858 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-mdf2n"
Jan 31 15:04:47 crc kubenswrapper[4751]: I0131 15:04:47.973241 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-96ldw"]
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.071735 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-db-sync-config-data\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.071819 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8jzf\" (UniqueName: \"kubernetes.io/projected/837cbdc7-6443-4789-a796-c2f1bd79119d-kube-api-access-w8jzf\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.071868 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-config-data\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.173847 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-db-sync-config-data\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.173959 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8jzf\" (UniqueName: \"kubernetes.io/projected/837cbdc7-6443-4789-a796-c2f1bd79119d-kube-api-access-w8jzf\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.174064 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-config-data\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.185269 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-config-data\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.185553 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-db-sync-config-data\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.197580 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8jzf\" (UniqueName: \"kubernetes.io/projected/837cbdc7-6443-4789-a796-c2f1bd79119d-kube-api-access-w8jzf\") pod \"glance-db-sync-96ldw\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") " pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.287550 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:48 crc kubenswrapper[4751]: I0131 15:04:48.532202 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-96ldw"]
Jan 31 15:04:49 crc kubenswrapper[4751]: I0131 15:04:49.098387 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-96ldw" event={"ID":"837cbdc7-6443-4789-a796-c2f1bd79119d","Type":"ContainerStarted","Data":"8c9ca246c6d8d22550b0a337fe277ae3824da506216668e0ce9b2ebcd4cee908"}
Jan 31 15:04:49 crc kubenswrapper[4751]: I0131 15:04:49.098747 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-96ldw" event={"ID":"837cbdc7-6443-4789-a796-c2f1bd79119d","Type":"ContainerStarted","Data":"826ef65431d89ba10bddb0143acb4f6024aa6af60be7b453598c0fcfebc2d4cc"}
Jan 31 15:04:49 crc kubenswrapper[4751]: I0131 15:04:49.118729 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-96ldw" podStartSLOduration=2.118704519 podStartE2EDuration="2.118704519s" podCreationTimestamp="2026-01-31 15:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:49.112344621 +0000 UTC m=+1393.487057516" watchObservedRunningTime="2026-01-31 15:04:49.118704519 +0000 UTC m=+1393.493417404"
Jan 31 15:04:52 crc kubenswrapper[4751]: I0131 15:04:52.128597 4751 generic.go:334] "Generic (PLEG): container finished" podID="837cbdc7-6443-4789-a796-c2f1bd79119d" containerID="8c9ca246c6d8d22550b0a337fe277ae3824da506216668e0ce9b2ebcd4cee908" exitCode=0
Jan 31 15:04:52 crc kubenswrapper[4751]: I0131 15:04:52.128675 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-96ldw" event={"ID":"837cbdc7-6443-4789-a796-c2f1bd79119d","Type":"ContainerDied","Data":"8c9ca246c6d8d22550b0a337fe277ae3824da506216668e0ce9b2ebcd4cee908"}
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.486530 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.650755 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-db-sync-config-data\") pod \"837cbdc7-6443-4789-a796-c2f1bd79119d\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") "
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.650844 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-config-data\") pod \"837cbdc7-6443-4789-a796-c2f1bd79119d\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") "
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.651151 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8jzf\" (UniqueName: \"kubernetes.io/projected/837cbdc7-6443-4789-a796-c2f1bd79119d-kube-api-access-w8jzf\") pod \"837cbdc7-6443-4789-a796-c2f1bd79119d\" (UID: \"837cbdc7-6443-4789-a796-c2f1bd79119d\") "
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.655634 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "837cbdc7-6443-4789-a796-c2f1bd79119d" (UID: "837cbdc7-6443-4789-a796-c2f1bd79119d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.655638 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837cbdc7-6443-4789-a796-c2f1bd79119d-kube-api-access-w8jzf" (OuterVolumeSpecName: "kube-api-access-w8jzf") pod "837cbdc7-6443-4789-a796-c2f1bd79119d" (UID: "837cbdc7-6443-4789-a796-c2f1bd79119d"). InnerVolumeSpecName "kube-api-access-w8jzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.719386 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-config-data" (OuterVolumeSpecName: "config-data") pod "837cbdc7-6443-4789-a796-c2f1bd79119d" (UID: "837cbdc7-6443-4789-a796-c2f1bd79119d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.753686 4751 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.753740 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837cbdc7-6443-4789-a796-c2f1bd79119d-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:53 crc kubenswrapper[4751]: I0131 15:04:53.753761 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8jzf\" (UniqueName: \"kubernetes.io/projected/837cbdc7-6443-4789-a796-c2f1bd79119d-kube-api-access-w8jzf\") on node \"crc\" DevicePath \"\""
Jan 31 15:04:54 crc kubenswrapper[4751]: I0131 15:04:54.152731 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-96ldw" event={"ID":"837cbdc7-6443-4789-a796-c2f1bd79119d","Type":"ContainerDied","Data":"826ef65431d89ba10bddb0143acb4f6024aa6af60be7b453598c0fcfebc2d4cc"}
Jan 31 15:04:54 crc kubenswrapper[4751]: I0131 15:04:54.152774 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="826ef65431d89ba10bddb0143acb4f6024aa6af60be7b453598c0fcfebc2d4cc"
Jan 31 15:04:54 crc kubenswrapper[4751]: I0131 15:04:54.153181 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-96ldw"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.307022 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Jan 31 15:04:55 crc kubenswrapper[4751]: E0131 15:04:55.307393 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837cbdc7-6443-4789-a796-c2f1bd79119d" containerName="glance-db-sync"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.307410 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="837cbdc7-6443-4789-a796-c2f1bd79119d" containerName="glance-db-sync"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.307584 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="837cbdc7-6443-4789-a796-c2f1bd79119d" containerName="glance-db-sync"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.308513 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.313794 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.313797 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.316342 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.317943 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-mdf2n"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479200 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-scripts\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479250 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479273 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-config-data\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479293 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-logs\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479337 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjmr8\" (UniqueName: \"kubernetes.io/projected/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-kube-api-access-rjmr8\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479359 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-sys\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479385 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479404 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479423 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-dev\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479437 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479473 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479493 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-run\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479506 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.479526 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.541319 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.542762 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.572265 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.580991 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-run\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581029 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581054 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581100 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-scripts\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581121 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581149 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-config-data\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581166 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-logs\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581219 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjmr8\" (UniqueName: \"kubernetes.io/projected/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-kube-api-access-rjmr8\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581235 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-sys\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581287 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581305 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-dev\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581320 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581354 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581433 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581468 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-run\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581490 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.581817 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.582463 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.583043 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-sys\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.583578 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-logs\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.583633 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.583973 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.585293 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-dev\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.585351 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.591377 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-scripts\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.593926 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-config-data\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.618451 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.625509 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjmr8\" (UniqueName: \"kubernetes.io/projected/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-kube-api-access-rjmr8\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.647276 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"glance-default-external-api-1\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") " pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.662311 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.663668 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.671607 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689376 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689438 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689463 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689492 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689523 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-dev\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689566 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689589 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-run\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689612 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689646 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689675 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689706 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-logs\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689727 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689770 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9x4g\" (UniqueName: \"kubernetes.io/projected/d1590321-a9e4-43b9-a9a0-04dc832b3332-kube-api-access-g9x4g\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.689795 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-sys\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.692555 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.693890 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.703110 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.710711 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791215 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791261 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791289 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-config-data\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791339 4751
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-scripts\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791355 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791468 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9x4g\" (UniqueName: \"kubernetes.io/projected/d1590321-a9e4-43b9-a9a0-04dc832b3332-kube-api-access-g9x4g\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791534 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791561 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc 
kubenswrapper[4751]: I0131 15:04:55.791587 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791633 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-sys\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791691 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-dev\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791778 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-sys\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791798 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791863 4751 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-sys\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791889 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791916 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-dev\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.791991 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792040 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792081 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792156 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792189 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792217 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792244 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792271 4751 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792278 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792361 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792396 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792396 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792447 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-dev\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792369 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792466 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-run\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792484 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-sys\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792508 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-dev\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792555 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792609 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792642 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-run\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792663 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792684 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792691 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-run\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792727 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792753 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-run\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792773 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792789 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-logs\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792807 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792836 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj5bx\" (UniqueName: \"kubernetes.io/projected/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-kube-api-access-mj5bx\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792863 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792893 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-logs\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792912 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b65vr\" (UniqueName: \"kubernetes.io/projected/b39c23a2-492d-4401-bb73-6b4bfc849bec-kube-api-access-b65vr\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792939 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.792957 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.793039 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.793317 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.793335 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-logs\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.796524 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.799288 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.810786 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9x4g\" (UniqueName: \"kubernetes.io/projected/d1590321-a9e4-43b9-a9a0-04dc832b3332-kube-api-access-g9x4g\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.813677 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.815934 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.863932 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.893915 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-run\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894095 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-logs\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894208 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894299 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj5bx\" (UniqueName: \"kubernetes.io/projected/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-kube-api-access-mj5bx\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894387 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894473 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b65vr\" (UniqueName: \"kubernetes.io/projected/b39c23a2-492d-4401-bb73-6b4bfc849bec-kube-api-access-b65vr\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894566 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894681 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894779 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-config-data\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894879 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-scripts\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.894961 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895036 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895137 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895217 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895299 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-dev\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc 
kubenswrapper[4751]: I0131 15:04:55.895369 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895436 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-sys\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895513 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895579 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-dev\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895656 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895726 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895799 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895868 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.895935 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.896036 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.896142 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-run\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.896265 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-sys\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.896349 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.896503 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.896595 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-run\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.897017 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-logs\") pod 
\"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.897529 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-dev\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.898253 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.898323 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-sys\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.898366 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.898685 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.898755 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.899000 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900707 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-dev\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900790 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900834 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900869 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-run\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900863 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900898 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900942 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-sys\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900954 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") device mount path \"/mnt/openstack/pv15\"" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900980 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.900995 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.901035 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.902210 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.903568 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 
15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.904083 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.905965 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-config-data\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.909739 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-scripts\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.920691 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj5bx\" (UniqueName: \"kubernetes.io/projected/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-kube-api-access-mj5bx\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.922728 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b65vr\" (UniqueName: \"kubernetes.io/projected/b39c23a2-492d-4401-bb73-6b4bfc849bec-kube-api-access-b65vr\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.927489 
4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.935483 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.945242 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.959141 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:55 crc kubenswrapper[4751]: I0131 15:04:55.982458 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.019710 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.027846 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.299502 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:04:56 crc kubenswrapper[4751]: W0131 15:04:56.302128 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1590321_a9e4_43b9_a9a0_04dc832b3332.slice/crio-74ea627d68211382ddcd6045fc064247fa1aa0e13ee2656765ee16aee6d9bb61 WatchSource:0}: Error finding container 74ea627d68211382ddcd6045fc064247fa1aa0e13ee2656765ee16aee6d9bb61: Status 404 returned error can't find the container with id 74ea627d68211382ddcd6045fc064247fa1aa0e13ee2656765ee16aee6d9bb61 Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.399824 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:04:56 crc kubenswrapper[4751]: W0131 15:04:56.402976 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ce40f98_80af_4a4b_8556_c5c7dd84fc58.slice/crio-c11c5ca70f63ee145b6adb16291fa14f1d7247eb5f019288ced5f1338933a04e WatchSource:0}: Error finding container c11c5ca70f63ee145b6adb16291fa14f1d7247eb5f019288ced5f1338933a04e: Status 404 returned error can't find the container with id c11c5ca70f63ee145b6adb16291fa14f1d7247eb5f019288ced5f1338933a04e Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.449935 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.480596 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:04:56 crc kubenswrapper[4751]: W0131 15:04:56.486528 4751 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb39c23a2_492d_4401_bb73_6b4bfc849bec.slice/crio-995975e00b456601ec8a0bc0c21da9af3666fa6889c9406bb7f743ae54f464a2 WatchSource:0}: Error finding container 995975e00b456601ec8a0bc0c21da9af3666fa6889c9406bb7f743ae54f464a2: Status 404 returned error can't find the container with id 995975e00b456601ec8a0bc0c21da9af3666fa6889c9406bb7f743ae54f464a2 Jan 31 15:04:56 crc kubenswrapper[4751]: I0131 15:04:56.551954 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:04:56 crc kubenswrapper[4751]: W0131 15:04:56.559231 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8309fec8_e5ee_4e23_8617_ab2e7ba833d6.slice/crio-dd5644a60c6d37de5875344e64047668e5f4f00160a8166735d8995e0b9a3862 WatchSource:0}: Error finding container dd5644a60c6d37de5875344e64047668e5f4f00160a8166735d8995e0b9a3862: Status 404 returned error can't find the container with id dd5644a60c6d37de5875344e64047668e5f4f00160a8166735d8995e0b9a3862 Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.176784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8309fec8-e5ee-4e23-8617-ab2e7ba833d6","Type":"ContainerStarted","Data":"2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.177235 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8309fec8-e5ee-4e23-8617-ab2e7ba833d6","Type":"ContainerStarted","Data":"dd5644a60c6d37de5875344e64047668e5f4f00160a8166735d8995e0b9a3862"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.178156 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"d1590321-a9e4-43b9-a9a0-04dc832b3332","Type":"ContainerStarted","Data":"efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.178301 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d1590321-a9e4-43b9-a9a0-04dc832b3332","Type":"ContainerStarted","Data":"74ea627d68211382ddcd6045fc064247fa1aa0e13ee2656765ee16aee6d9bb61"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.179586 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2ce40f98-80af-4a4b-8556-c5c7dd84fc58","Type":"ContainerStarted","Data":"e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.179699 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2ce40f98-80af-4a4b-8556-c5c7dd84fc58","Type":"ContainerStarted","Data":"c11c5ca70f63ee145b6adb16291fa14f1d7247eb5f019288ced5f1338933a04e"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.181080 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"b39c23a2-492d-4401-bb73-6b4bfc849bec","Type":"ContainerStarted","Data":"ade6d8caba0bf8e6958b676f72eece1bb73609f4cb6e8cdf0b7220751b79dcad"} Jan 31 15:04:57 crc kubenswrapper[4751]: I0131 15:04:57.181119 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"b39c23a2-492d-4401-bb73-6b4bfc849bec","Type":"ContainerStarted","Data":"995975e00b456601ec8a0bc0c21da9af3666fa6889c9406bb7f743ae54f464a2"} Jan 31 15:04:58 crc kubenswrapper[4751]: I0131 15:04:58.192234 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"d1590321-a9e4-43b9-a9a0-04dc832b3332","Type":"ContainerStarted","Data":"79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a"} Jan 31 15:04:58 crc kubenswrapper[4751]: I0131 15:04:58.194238 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"b39c23a2-492d-4401-bb73-6b4bfc849bec","Type":"ContainerStarted","Data":"0ccf08a2c3c9ee0e1a6fd7438c1d413c5265de24852c5a25306ee98ea0c2399b"} Jan 31 15:04:58 crc kubenswrapper[4751]: I0131 15:04:58.194368 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-log" containerID="cri-o://ade6d8caba0bf8e6958b676f72eece1bb73609f4cb6e8cdf0b7220751b79dcad" gracePeriod=30 Jan 31 15:04:58 crc kubenswrapper[4751]: I0131 15:04:58.194674 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-httpd" containerID="cri-o://0ccf08a2c3c9ee0e1a6fd7438c1d413c5265de24852c5a25306ee98ea0c2399b" gracePeriod=30 Jan 31 15:04:58 crc kubenswrapper[4751]: I0131 15:04:58.221901 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.221865722 podStartE2EDuration="4.221865722s" podCreationTimestamp="2026-01-31 15:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:58.220792063 +0000 UTC m=+1402.595504968" watchObservedRunningTime="2026-01-31 15:04:58.221865722 +0000 UTC m=+1402.596578647" Jan 31 15:04:58 crc kubenswrapper[4751]: I0131 15:04:58.265091 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" 
podStartSLOduration=4.265051131 podStartE2EDuration="4.265051131s" podCreationTimestamp="2026-01-31 15:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:58.257597254 +0000 UTC m=+1402.632310149" watchObservedRunningTime="2026-01-31 15:04:58.265051131 +0000 UTC m=+1402.639764026" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.203343 4751 generic.go:334] "Generic (PLEG): container finished" podID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerID="0ccf08a2c3c9ee0e1a6fd7438c1d413c5265de24852c5a25306ee98ea0c2399b" exitCode=143 Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.203735 4751 generic.go:334] "Generic (PLEG): container finished" podID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerID="ade6d8caba0bf8e6958b676f72eece1bb73609f4cb6e8cdf0b7220751b79dcad" exitCode=143 Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.203417 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"b39c23a2-492d-4401-bb73-6b4bfc849bec","Type":"ContainerDied","Data":"0ccf08a2c3c9ee0e1a6fd7438c1d413c5265de24852c5a25306ee98ea0c2399b"} Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.203772 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"b39c23a2-492d-4401-bb73-6b4bfc849bec","Type":"ContainerDied","Data":"ade6d8caba0bf8e6958b676f72eece1bb73609f4cb6e8cdf0b7220751b79dcad"} Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.205451 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8309fec8-e5ee-4e23-8617-ab2e7ba833d6","Type":"ContainerStarted","Data":"0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546"} Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.207124 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2ce40f98-80af-4a4b-8556-c5c7dd84fc58","Type":"ContainerStarted","Data":"5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848"} Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.563151 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.586284 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=4.586265291 podStartE2EDuration="4.586265291s" podCreationTimestamp="2026-01-31 15:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:04:59.235250332 +0000 UTC m=+1403.609963217" watchObservedRunningTime="2026-01-31 15:04:59.586265291 +0000 UTC m=+1403.960978176" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676559 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-config-data\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676610 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-run\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676655 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-scripts\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") 
" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676674 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-var-locks-brick\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676691 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-lib-modules\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676712 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b65vr\" (UniqueName: \"kubernetes.io/projected/b39c23a2-492d-4401-bb73-6b4bfc849bec-kube-api-access-b65vr\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676749 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-httpd-run\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676761 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-dev\") pod 
\"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676772 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-run" (OuterVolumeSpecName: "run") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676811 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676800 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676887 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-sys\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.676930 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-iscsi\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 
15:04:59.677012 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-logs\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677036 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-nvme\") pod \"b39c23a2-492d-4401-bb73-6b4bfc849bec\" (UID: \"b39c23a2-492d-4401-bb73-6b4bfc849bec\") " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677292 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-sys" (OuterVolumeSpecName: "sys") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677318 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677343 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677402 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677462 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677476 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-dev" (OuterVolumeSpecName: "dev") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677579 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677595 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677603 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677613 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677624 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677632 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677640 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677648 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b39c23a2-492d-4401-bb73-6b4bfc849bec-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.677609 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-logs" (OuterVolumeSpecName: "logs") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.682275 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39c23a2-492d-4401-bb73-6b4bfc849bec-kube-api-access-b65vr" (OuterVolumeSpecName: "kube-api-access-b65vr") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "kube-api-access-b65vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.682381 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.683470 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.685810 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-scripts" (OuterVolumeSpecName: "scripts") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.723438 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-config-data" (OuterVolumeSpecName: "config-data") pod "b39c23a2-492d-4401-bb73-6b4bfc849bec" (UID: "b39c23a2-492d-4401-bb73-6b4bfc849bec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.779533 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.779569 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b65vr\" (UniqueName: \"kubernetes.io/projected/b39c23a2-492d-4401-bb73-6b4bfc849bec-kube-api-access-b65vr\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.779607 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.779620 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 
15:04:59.779630 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b39c23a2-492d-4401-bb73-6b4bfc849bec-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.779646 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b39c23a2-492d-4401-bb73-6b4bfc849bec-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.794536 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.797979 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.881268 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:04:59 crc kubenswrapper[4751]: I0131 15:04:59.881314 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.220308 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"b39c23a2-492d-4401-bb73-6b4bfc849bec","Type":"ContainerDied","Data":"995975e00b456601ec8a0bc0c21da9af3666fa6889c9406bb7f743ae54f464a2"} Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.220417 4751 scope.go:117] "RemoveContainer" containerID="0ccf08a2c3c9ee0e1a6fd7438c1d413c5265de24852c5a25306ee98ea0c2399b" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.222441 4751 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.260630 4751 scope.go:117] "RemoveContainer" containerID="ade6d8caba0bf8e6958b676f72eece1bb73609f4cb6e8cdf0b7220751b79dcad" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.261765 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=6.261732636 podStartE2EDuration="6.261732636s" podCreationTimestamp="2026-01-31 15:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:05:00.247170082 +0000 UTC m=+1404.621882987" watchObservedRunningTime="2026-01-31 15:05:00.261732636 +0000 UTC m=+1404.636445561" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.280440 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.291485 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.308187 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:05:00 crc kubenswrapper[4751]: E0131 15:05:00.308796 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-httpd" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.308888 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-httpd" Jan 31 15:05:00 crc kubenswrapper[4751]: E0131 15:05:00.308979 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-log" Jan 31 15:05:00 crc 
kubenswrapper[4751]: I0131 15:05:00.309053 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-log" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.309299 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-log" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.309375 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" containerName="glance-httpd" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.310260 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.321588 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.390472 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.390713 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-run\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.390865 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-logs\") pod 
\"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.390968 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-sys\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391121 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-dev\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391183 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391281 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391339 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391369 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391433 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391528 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391691 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391748 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.391771 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmpk8\" (UniqueName: \"kubernetes.io/projected/fada73df-4c18-4f18-9fcd-9fe24825a32c-kube-api-access-pmpk8\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.418671 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39c23a2-492d-4401-bb73-6b4bfc849bec" path="/var/lib/kubelet/pods/b39c23a2-492d-4401-bb73-6b4bfc849bec/volumes" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.493686 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-run\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.493930 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-logs\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.493772 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-run\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") 
" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.493978 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-sys\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.493958 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-sys\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494044 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-dev\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494108 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494178 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: 
I0131 15:05:00.494212 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494246 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494291 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494345 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494414 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494447 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494478 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpk8\" (UniqueName: \"kubernetes.io/projected/fada73df-4c18-4f18-9fcd-9fe24825a32c-kube-api-access-pmpk8\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494479 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-logs\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494634 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494826 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494867 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.494899 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-dev\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.495344 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.495391 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.495415 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.495450 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.495763 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.499461 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-scripts\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.504374 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-config-data\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.514279 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmpk8\" (UniqueName: \"kubernetes.io/projected/fada73df-4c18-4f18-9fcd-9fe24825a32c-kube-api-access-pmpk8\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.525121 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.539853 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") " pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:00 crc kubenswrapper[4751]: I0131 15:05:00.629125 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:01 crc kubenswrapper[4751]: I0131 15:05:01.105127 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Jan 31 15:05:01 crc kubenswrapper[4751]: I0131 15:05:01.231924 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"fada73df-4c18-4f18-9fcd-9fe24825a32c","Type":"ContainerStarted","Data":"4e85dc9929fef5538bbedbabd1bc4862934d09f3a214bd5393c16cf9dbfd21f7"}
Jan 31 15:05:02 crc kubenswrapper[4751]: I0131 15:05:02.245291 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"fada73df-4c18-4f18-9fcd-9fe24825a32c","Type":"ContainerStarted","Data":"0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536"}
Jan 31 15:05:02 crc kubenswrapper[4751]: I0131 15:05:02.246940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"fada73df-4c18-4f18-9fcd-9fe24825a32c","Type":"ContainerStarted","Data":"4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c"}
Jan 31 15:05:02 crc kubenswrapper[4751]: I0131 15:05:02.271296 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=2.271277753 podStartE2EDuration="2.271277753s" podCreationTimestamp="2026-01-31 15:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:05:02.267598356 +0000 UTC m=+1406.642311261" watchObservedRunningTime="2026-01-31 15:05:02.271277753 +0000 UTC m=+1406.645990638"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.864403 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.864884 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.892731 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.916930 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.931497 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.931590 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.967195 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:05 crc kubenswrapper[4751]: I0131 15:05:05.980695 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.020879 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.020938 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.043773 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.068572 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.279879 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.279968 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.280305 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.280335 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.280347 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:06 crc kubenswrapper[4751]: I0131 15:05:06.280356 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.188134 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.194109 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.229908 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.230713 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.245634 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.248853 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.355944 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.896701 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 15:05:08 crc kubenswrapper[4751]: I0131 15:05:08.896751 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 15:05:10 crc kubenswrapper[4751]: I0131 15:05:10.321784 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-log" containerID="cri-o://efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe" gracePeriod=30
Jan 31 15:05:10 crc kubenswrapper[4751]: I0131 15:05:10.321892 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-httpd" containerID="cri-o://79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a" gracePeriod=30
Jan 31 15:05:10 crc kubenswrapper[4751]: I0131 15:05:10.629880 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:10 crc kubenswrapper[4751]: I0131 15:05:10.629953 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:10 crc kubenswrapper[4751]: I0131 15:05:10.673503 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:10 crc kubenswrapper[4751]: I0131 15:05:10.702166 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:11 crc kubenswrapper[4751]: I0131 15:05:11.332564 4751 generic.go:334] "Generic (PLEG): container finished" podID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerID="efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe" exitCode=143
Jan 31 15:05:11 crc kubenswrapper[4751]: I0131 15:05:11.332606 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d1590321-a9e4-43b9-a9a0-04dc832b3332","Type":"ContainerDied","Data":"efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe"}
Jan 31 15:05:11 crc kubenswrapper[4751]: I0131 15:05:11.332868 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:11 crc kubenswrapper[4751]: I0131 15:05:11.332891 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.228877 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.231548 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.291724 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.292061 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-log" containerID="cri-o://2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2" gracePeriod=30
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.292121 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-httpd" containerID="cri-o://0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546" gracePeriod=30
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.820757 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.908936 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9x4g\" (UniqueName: \"kubernetes.io/projected/d1590321-a9e4-43b9-a9a0-04dc832b3332-kube-api-access-g9x4g\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.908980 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-dev\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909009 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-logs\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909026 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909048 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-nvme\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909098 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909149 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-config-data\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909245 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-httpd-run\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909293 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-iscsi\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909313 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-run\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909333 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-sys\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909363 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-var-locks-brick\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909377 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-lib-modules\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.909396 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-scripts\") pod \"d1590321-a9e4-43b9-a9a0-04dc832b3332\" (UID: \"d1590321-a9e4-43b9-a9a0-04dc832b3332\") "
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910513 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-run" (OuterVolumeSpecName: "run") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910600 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910612 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910635 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-sys" (OuterVolumeSpecName: "sys") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910695 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910718 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910790 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-dev" (OuterVolumeSpecName: "dev") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.910869 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-logs" (OuterVolumeSpecName: "logs") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.911073 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.919236 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1590321-a9e4-43b9-a9a0-04dc832b3332-kube-api-access-g9x4g" (OuterVolumeSpecName: "kube-api-access-g9x4g") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "kube-api-access-g9x4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.927981 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.927997 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-scripts" (OuterVolumeSpecName: "scripts") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.928178 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:05:13 crc kubenswrapper[4751]: I0131 15:05:13.956931 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-config-data" (OuterVolumeSpecName: "config-data") pod "d1590321-a9e4-43b9-a9a0-04dc832b3332" (UID: "d1590321-a9e4-43b9-a9a0-04dc832b3332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010773 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010813 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010830 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010842 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010853 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010866 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-lib-modules\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010876 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010899 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9x4g\" (UniqueName: \"kubernetes.io/projected/d1590321-a9e4-43b9-a9a0-04dc832b3332-kube-api-access-g9x4g\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010912 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010922 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1590321-a9e4-43b9-a9a0-04dc832b3332-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010953 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010964 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1590321-a9e4-43b9-a9a0-04dc832b3332-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010980 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.010992 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1590321-a9e4-43b9-a9a0-04dc832b3332-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.037973 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.044155 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.112813 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.112854 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.364393 4751 generic.go:334] "Generic (PLEG): container finished" podID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerID="79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a" exitCode=0
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.364449 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d1590321-a9e4-43b9-a9a0-04dc832b3332","Type":"ContainerDied","Data":"79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a"}
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.364504 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"d1590321-a9e4-43b9-a9a0-04dc832b3332","Type":"ContainerDied","Data":"74ea627d68211382ddcd6045fc064247fa1aa0e13ee2656765ee16aee6d9bb61"}
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.364522 4751 scope.go:117] "RemoveContainer" containerID="79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.364467 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.367570 4751 generic.go:334] "Generic (PLEG): container finished" podID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerID="2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2" exitCode=143
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.369025 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8309fec8-e5ee-4e23-8617-ab2e7ba833d6","Type":"ContainerDied","Data":"2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2"}
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.388084 4751 scope.go:117] "RemoveContainer" containerID="efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.410417 4751 scope.go:117] "RemoveContainer" containerID="79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a"
Jan 31 15:05:14 crc kubenswrapper[4751]: E0131 15:05:14.410809 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a\": container with ID starting with 79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a not found: ID does not exist" containerID="79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.410846 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a"} err="failed to get container status \"79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a\": rpc error: code = NotFound desc = could not find container \"79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a\": container with ID starting with 79f170b0f270f110e5b5d5229988d518034d01da466dbe2c3254336ea700274a not found: ID does not exist"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.410874 4751 scope.go:117] "RemoveContainer" containerID="efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe"
Jan 31 15:05:14 crc kubenswrapper[4751]: E0131 15:05:14.411347 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe\": container with ID starting with efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe not found: ID does not exist" containerID="efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.411374 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe"} err="failed to get container status \"efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe\": rpc error: code = NotFound desc = could not find container \"efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe\": container with ID starting with efd3679f0fb441a01fa19571a9aa5ba37ff46c03bfa892f113c431e3a9022ffe not found: ID does not exist"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.429206 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.436772 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.466511 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:05:14 crc kubenswrapper[4751]: E0131 15:05:14.466902 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-log"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.466921 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-log"
Jan 31 15:05:14 crc kubenswrapper[4751]: E0131 15:05:14.466962 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-httpd"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.466971 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-httpd"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.467146 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-log"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.467166 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" containerName="glance-httpd"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.468047 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.473910 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.621891 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-dev\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.622261 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-scripts\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.622301 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.622329 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-sys\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.622437 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-logs\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.622736 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.622783 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623006 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623040 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623287 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwj7s\" (UniqueName: \"kubernetes.io/projected/90af064c-9d0a-4818-8e19-c87da44a879b-kube-api-access-rwj7s\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623320 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-run\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623365 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623405 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.623435 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-config-data\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725275 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwj7s\" (UniqueName: \"kubernetes.io/projected/90af064c-9d0a-4818-8e19-c87da44a879b-kube-api-access-rwj7s\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725346 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-run\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725398 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725435 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725468 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-run\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725480
4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-config-data\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725571 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-dev\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725593 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725604 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-scripts\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725696 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-dev\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725706 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725743 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725767 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725803 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-sys\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725772 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-sys\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725857 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-logs\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725935 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.725991 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726042 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726110 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726228 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726265 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726341 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726386 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.726605 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-logs\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.731096 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.732664 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-config-data\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.754006 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.758476 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwj7s\" (UniqueName: \"kubernetes.io/projected/90af064c-9d0a-4818-8e19-c87da44a879b-kube-api-access-rwj7s\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.777615 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:14 crc kubenswrapper[4751]: I0131 15:05:14.789938 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:15 crc kubenswrapper[4751]: I0131 15:05:15.216993 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:05:15 crc kubenswrapper[4751]: I0131 15:05:15.378479 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"90af064c-9d0a-4818-8e19-c87da44a879b","Type":"ContainerStarted","Data":"8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f"} Jan 31 15:05:15 crc kubenswrapper[4751]: I0131 15:05:15.378876 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"90af064c-9d0a-4818-8e19-c87da44a879b","Type":"ContainerStarted","Data":"e99fc8a548b6e6e8e6da564fb55696f96c325bf5ca3500bbda2b1e9e31f7bf04"} Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.390593 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"90af064c-9d0a-4818-8e19-c87da44a879b","Type":"ContainerStarted","Data":"e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b"} Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.419024 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1590321-a9e4-43b9-a9a0-04dc832b3332" path="/var/lib/kubelet/pods/d1590321-a9e4-43b9-a9a0-04dc832b3332/volumes" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.439922 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.439902557 podStartE2EDuration="2.439902557s" podCreationTimestamp="2026-01-31 15:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:05:16.423111004 +0000 UTC m=+1420.797823889" 
watchObservedRunningTime="2026-01-31 15:05:16.439902557 +0000 UTC m=+1420.814615452" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.864605 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.964893 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-logs\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.964938 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.964968 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-sys\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965034 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-config-data\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965062 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-lib-modules\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc 
kubenswrapper[4751]: I0131 15:05:16.965112 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-nvme\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965166 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-httpd-run\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965187 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-run\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965209 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj5bx\" (UniqueName: \"kubernetes.io/projected/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-kube-api-access-mj5bx\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965252 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-scripts\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965279 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: 
\"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965316 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-iscsi\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965353 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-var-locks-brick\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965391 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-dev\") pod \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\" (UID: \"8309fec8-e5ee-4e23-8617-ab2e7ba833d6\") " Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965795 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-dev" (OuterVolumeSpecName: "dev") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965835 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965835 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-sys" (OuterVolumeSpecName: "sys") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965877 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965907 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965950 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.965953 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-run" (OuterVolumeSpecName: "run") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.966119 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.966193 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-logs" (OuterVolumeSpecName: "logs") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.971153 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.972202 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-scripts" (OuterVolumeSpecName: "scripts") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.973491 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-kube-api-access-mj5bx" (OuterVolumeSpecName: "kube-api-access-mj5bx") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "kube-api-access-mj5bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:05:16 crc kubenswrapper[4751]: I0131 15:05:16.982591 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.022200 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-config-data" (OuterVolumeSpecName: "config-data") pod "8309fec8-e5ee-4e23-8617-ab2e7ba833d6" (UID: "8309fec8-e5ee-4e23-8617-ab2e7ba833d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067601 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067635 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067651 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj5bx\" (UniqueName: \"kubernetes.io/projected/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-kube-api-access-mj5bx\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067686 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067720 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067731 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067740 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067749 4751 reconciler_common.go:293] "Volume detached for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067758 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067771 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067779 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067787 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067795 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.067803 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8309fec8-e5ee-4e23-8617-ab2e7ba833d6-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.083408 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.086786 4751 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.168884 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.168916 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.399163 4751 generic.go:334] "Generic (PLEG): container finished" podID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerID="0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546" exitCode=0 Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.399260 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.399253 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8309fec8-e5ee-4e23-8617-ab2e7ba833d6","Type":"ContainerDied","Data":"0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546"} Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.399330 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"8309fec8-e5ee-4e23-8617-ab2e7ba833d6","Type":"ContainerDied","Data":"dd5644a60c6d37de5875344e64047668e5f4f00160a8166735d8995e0b9a3862"} Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.399355 4751 scope.go:117] "RemoveContainer" containerID="0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.435968 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.445234 4751 scope.go:117] "RemoveContainer" containerID="2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.448634 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.463190 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:05:17 crc kubenswrapper[4751]: E0131 15:05:17.463616 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-log" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.463634 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-log" Jan 31 15:05:17 crc kubenswrapper[4751]: E0131 15:05:17.463659 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-httpd" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.463664 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-httpd" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.463801 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-log" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.463813 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" containerName="glance-httpd" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.464687 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.472022 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.472305 4751 scope.go:117] "RemoveContainer" containerID="0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546" Jan 31 15:05:17 crc kubenswrapper[4751]: E0131 15:05:17.480240 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546\": container with ID starting with 0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546 not found: ID does not exist" containerID="0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.480299 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546"} err="failed to get container status \"0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546\": rpc error: code = NotFound desc = could not find container \"0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546\": container with ID starting with 0da059b6d1edf05d0ffe817ea6cbe4df21ba9242536bbbd9d3e7d479bd096546 not found: ID does not exist" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.480332 4751 scope.go:117] "RemoveContainer" containerID="2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2" Jan 31 15:05:17 crc kubenswrapper[4751]: E0131 15:05:17.483126 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2\": container with ID starting with 
2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2 not found: ID does not exist" containerID="2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.483184 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2"} err="failed to get container status \"2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2\": rpc error: code = NotFound desc = could not find container \"2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2\": container with ID starting with 2af1683a972bf70a8cfc2843e554280e84c5731cd7be66a108127c478a878dd2 not found: ID does not exist" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.575243 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.575509 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-dev\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.575583 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc 
kubenswrapper[4751]: I0131 15:05:17.575645 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.575770 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.575859 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.575927 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9kwf\" (UniqueName: \"kubernetes.io/projected/e0e8efba-9adc-482b-bd77-553d76648ac6-kube-api-access-l9kwf\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576019 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-sys\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576109 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576184 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576248 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-run\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576338 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576554 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.576707 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681212 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681267 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681302 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681323 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681358 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-dev\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681376 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681391 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681419 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681446 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681445 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681495 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681800 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681833 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682237 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" 
Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682487 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682527 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682549 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-dev\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.681462 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9kwf\" (UniqueName: \"kubernetes.io/projected/e0e8efba-9adc-482b-bd77-553d76648ac6-kube-api-access-l9kwf\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682586 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-sys\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682605 4751 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682621 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-run\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.682635 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.684735 4751 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.685265 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-run\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.685331 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-sys\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.689950 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.690570 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.701863 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9kwf\" (UniqueName: \"kubernetes.io/projected/e0e8efba-9adc-482b-bd77-553d76648ac6-kube-api-access-l9kwf\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.716847 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.717703 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:17 crc kubenswrapper[4751]: I0131 15:05:17.807866 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:18 crc kubenswrapper[4751]: I0131 15:05:18.233848 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:05:18 crc kubenswrapper[4751]: I0131 15:05:18.415261 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8309fec8-e5ee-4e23-8617-ab2e7ba833d6" path="/var/lib/kubelet/pods/8309fec8-e5ee-4e23-8617-ab2e7ba833d6/volumes" Jan 31 15:05:18 crc kubenswrapper[4751]: I0131 15:05:18.417774 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0e8efba-9adc-482b-bd77-553d76648ac6","Type":"ContainerStarted","Data":"de4d5808fc4ba7308f826962b8650c6ce882dcfa84a2b9961ed782c3d596f76e"} Jan 31 15:05:18 crc kubenswrapper[4751]: I0131 15:05:18.417813 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0e8efba-9adc-482b-bd77-553d76648ac6","Type":"ContainerStarted","Data":"8b49f74873882c94e30e439215c7b1269126be109dcab9f528966ad2a1118a0c"} Jan 31 15:05:19 crc kubenswrapper[4751]: I0131 15:05:19.424573 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0e8efba-9adc-482b-bd77-553d76648ac6","Type":"ContainerStarted","Data":"90a6f0dd1552854347833452de54355b3cea39a33ffea2db5092440b501dadb7"} Jan 31 15:05:19 crc kubenswrapper[4751]: I0131 15:05:19.452919 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.45289148 podStartE2EDuration="2.45289148s" 
podCreationTimestamp="2026-01-31 15:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:05:19.445580407 +0000 UTC m=+1423.820293292" watchObservedRunningTime="2026-01-31 15:05:19.45289148 +0000 UTC m=+1423.827604405" Jan 31 15:05:24 crc kubenswrapper[4751]: I0131 15:05:24.790964 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:24 crc kubenswrapper[4751]: I0131 15:05:24.793292 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:24 crc kubenswrapper[4751]: I0131 15:05:24.821697 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:24 crc kubenswrapper[4751]: I0131 15:05:24.829622 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:25 crc kubenswrapper[4751]: I0131 15:05:25.467498 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:25 crc kubenswrapper[4751]: I0131 15:05:25.467549 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:27 crc kubenswrapper[4751]: I0131 15:05:27.342475 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:27 crc kubenswrapper[4751]: I0131 15:05:27.349355 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:05:27 crc kubenswrapper[4751]: I0131 15:05:27.809100 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:27 crc kubenswrapper[4751]: I0131 15:05:27.809151 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:27 crc kubenswrapper[4751]: I0131 15:05:27.837827 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:27 crc kubenswrapper[4751]: I0131 15:05:27.848461 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:28 crc kubenswrapper[4751]: I0131 15:05:28.492563 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:28 crc kubenswrapper[4751]: I0131 15:05:28.492614 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:30 crc kubenswrapper[4751]: I0131 15:05:30.326711 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:30 crc kubenswrapper[4751]: I0131 15:05:30.330216 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:05:38 crc kubenswrapper[4751]: I0131 15:05:38.896942 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:05:38 crc kubenswrapper[4751]: I0131 15:05:38.897343 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:05:45 crc kubenswrapper[4751]: I0131 15:05:45.977094 4751 scope.go:117] "RemoveContainer" containerID="53e4421364bd50f8121a14bb4c3e20cbc7c5ba08c19bdb68ee47f37ac2b94308" Jan 31 15:05:46 crc kubenswrapper[4751]: I0131 15:05:46.000810 4751 scope.go:117] "RemoveContainer" containerID="4d83615719bc342a748610c14a746dcc08356aa04afe079d5f96b964e25ed0f6" Jan 31 15:05:46 crc kubenswrapper[4751]: I0131 15:05:46.073429 4751 scope.go:117] "RemoveContainer" containerID="4157453b55a598117cf21c7e58fec8625fe386b3472f188272985dac7429ad14" Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.074237 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4gxnx"] Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.081541 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4gxnx"] Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.703962 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.704328 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-httpd" containerID="cri-o://5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848" gracePeriod=30 Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.704304 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-log" containerID="cri-o://e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35" gracePeriod=30 Jan 31 15:06:05 crc 
kubenswrapper[4751]: I0131 15:06:05.847863 4751 generic.go:334] "Generic (PLEG): container finished" podID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerID="e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35" exitCode=143
Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.847903 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2ce40f98-80af-4a4b-8556-c5c7dd84fc58","Type":"ContainerDied","Data":"e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35"}
Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.867611 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"]
Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.867862 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-log" containerID="cri-o://4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c" gracePeriod=30
Jan 31 15:06:05 crc kubenswrapper[4751]: I0131 15:06:05.868296 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-httpd" containerID="cri-o://0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536" gracePeriod=30
Jan 31 15:06:06 crc kubenswrapper[4751]: I0131 15:06:06.420816 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf" path="/var/lib/kubelet/pods/6b6442aa-bbe5-4658-bfe9-28ee4f9b3daf/volumes"
Jan 31 15:06:06 crc kubenswrapper[4751]: I0131 15:06:06.858872 4751 generic.go:334] "Generic (PLEG): container finished" podID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerID="4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c" exitCode=143
Jan 31 15:06:06 crc kubenswrapper[4751]: I0131 15:06:06.858928 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"fada73df-4c18-4f18-9fcd-9fe24825a32c","Type":"ContainerDied","Data":"4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c"}
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.086949 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-96ldw"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.095941 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-96ldw"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.154732 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.154999 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-log" containerID="cri-o://de4d5808fc4ba7308f826962b8650c6ce882dcfa84a2b9961ed782c3d596f76e" gracePeriod=30
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.155682 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-httpd" containerID="cri-o://90a6f0dd1552854347833452de54355b3cea39a33ffea2db5092440b501dadb7" gracePeriod=30
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.166511 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancefb75-account-delete-rcplg"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.167693 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.173602 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancefb75-account-delete-rcplg"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.185361 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-operator-scripts\") pod \"glancefb75-account-delete-rcplg\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.185432 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x4fc\" (UniqueName: \"kubernetes.io/projected/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-kube-api-access-6x4fc\") pod \"glancefb75-account-delete-rcplg\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.225388 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.225672 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-log" containerID="cri-o://8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f" gracePeriod=30
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.226159 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-httpd" containerID="cri-o://e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b" gracePeriod=30
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.287111 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-operator-scripts\") pod \"glancefb75-account-delete-rcplg\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.287206 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x4fc\" (UniqueName: \"kubernetes.io/projected/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-kube-api-access-6x4fc\") pod \"glancefb75-account-delete-rcplg\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.288456 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-operator-scripts\") pod \"glancefb75-account-delete-rcplg\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.314943 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x4fc\" (UniqueName: \"kubernetes.io/projected/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-kube-api-access-6x4fc\") pod \"glancefb75-account-delete-rcplg\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.485229 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg"
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.730830 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancefb75-account-delete-rcplg"]
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.868279 4751 generic.go:334] "Generic (PLEG): container finished" podID="90af064c-9d0a-4818-8e19-c87da44a879b" containerID="8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f" exitCode=143
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.868352 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"90af064c-9d0a-4818-8e19-c87da44a879b","Type":"ContainerDied","Data":"8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f"}
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.871939 4751 generic.go:334] "Generic (PLEG): container finished" podID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerID="de4d5808fc4ba7308f826962b8650c6ce882dcfa84a2b9961ed782c3d596f76e" exitCode=143
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.871998 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0e8efba-9adc-482b-bd77-553d76648ac6","Type":"ContainerDied","Data":"de4d5808fc4ba7308f826962b8650c6ce882dcfa84a2b9961ed782c3d596f76e"}
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.873303 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg" event={"ID":"09ca3bf6-027a-4e7b-a142-44ad4308fd3e","Type":"ContainerStarted","Data":"f49a7bb7647ca08427d7a0e06c9d600f223665df0c37dd55d0d845016d321065"}
Jan 31 15:06:07 crc kubenswrapper[4751]: I0131 15:06:07.889121 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg" podStartSLOduration=0.889106287 podStartE2EDuration="889.106287ms" podCreationTimestamp="2026-01-31 15:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:06:07.884921346 +0000 UTC m=+1472.259634231" watchObservedRunningTime="2026-01-31 15:06:07.889106287 +0000 UTC m=+1472.263819162"
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.413250 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837cbdc7-6443-4789-a796-c2f1bd79119d" path="/var/lib/kubelet/pods/837cbdc7-6443-4789-a796-c2f1bd79119d/volumes"
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.884381 4751 generic.go:334] "Generic (PLEG): container finished" podID="09ca3bf6-027a-4e7b-a142-44ad4308fd3e" containerID="d083c8910ad51529e38c062770d9b1a45be20502e7c76553d0090ac7a9898be5" exitCode=0
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.884446 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg" event={"ID":"09ca3bf6-027a-4e7b-a142-44ad4308fd3e","Type":"ContainerDied","Data":"d083c8910ad51529e38c062770d9b1a45be20502e7c76553d0090ac7a9898be5"}
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.896370 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.896460 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.896517 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7"
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.897807 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89a88ddaeae8a6fe7859be79e45bc66e157a0d02a03f5daf69e0ab6320bd15be"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 15:06:08 crc kubenswrapper[4751]: I0131 15:06:08.897905 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://89a88ddaeae8a6fe7859be79e45bc66e157a0d02a03f5daf69e0ab6320bd15be" gracePeriod=600
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.218807 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1"
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415670 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415746 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-nvme\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415796 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-sys\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415811 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-run\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415900 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-sys" (OuterVolumeSpecName: "sys") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415916 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415961 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-logs\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.415982 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-run" (OuterVolumeSpecName: "run") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416003 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjmr8\" (UniqueName: \"kubernetes.io/projected/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-kube-api-access-rjmr8\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416024 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-config-data\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416047 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-dev\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416064 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-lib-modules\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416130 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-iscsi\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416179 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-scripts\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416209 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416224 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-httpd-run\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416248 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-var-locks-brick\") pod \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\" (UID: \"2ce40f98-80af-4a4b-8556-c5c7dd84fc58\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416296 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-logs" (OuterVolumeSpecName: "logs") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416503 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416530 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-dev" (OuterVolumeSpecName: "dev") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416546 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416558 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416559 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416566 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416576 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-nvme\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.416793 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.417156 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.421573 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-scripts" (OuterVolumeSpecName: "scripts") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.422140 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.422166 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-kube-api-access-rjmr8" (OuterVolumeSpecName: "kube-api-access-rjmr8") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "kube-api-access-rjmr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.424144 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "glance") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.468209 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-config-data" (OuterVolumeSpecName: "config-data") pod "2ce40f98-80af-4a4b-8556-c5c7dd84fc58" (UID: "2ce40f98-80af-4a4b-8556-c5c7dd84fc58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.488756 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1"
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519294 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjmr8\" (UniqueName: \"kubernetes.io/projected/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-kube-api-access-rjmr8\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519334 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519347 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519359 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-lib-modules\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519377 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519460 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519512 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519548 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519588 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2ce40f98-80af-4a4b-8556-c5c7dd84fc58-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.519614 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.545892 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc"
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.554837 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc"
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.620844 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-var-locks-brick\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.620957 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.620962 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621026 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-run\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621049 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-dev\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621084 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-logs\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621113 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmpk8\" (UniqueName: \"kubernetes.io/projected/fada73df-4c18-4f18-9fcd-9fe24825a32c-kube-api-access-pmpk8\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621133 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-config-data\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621156 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621170 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-sys\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621159 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-run" (OuterVolumeSpecName: "run") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621202 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-httpd-run\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621190 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-dev" (OuterVolumeSpecName: "dev") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621242 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-lib-modules\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621258 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-nvme\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621268 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-sys" (OuterVolumeSpecName: "sys") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621285 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-iscsi\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621315 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-scripts\") pod \"fada73df-4c18-4f18-9fcd-9fe24825a32c\" (UID: \"fada73df-4c18-4f18-9fcd-9fe24825a32c\") "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621588 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-run\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621611 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-dev\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621619 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-sys\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621627 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621634 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621647 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-var-locks-brick\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621698 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-logs" (OuterVolumeSpecName: "logs") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.621756 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.622001 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.622000 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.622035 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.623867 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.624489 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-scripts" (OuterVolumeSpecName: "scripts") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.624911 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fada73df-4c18-4f18-9fcd-9fe24825a32c-kube-api-access-pmpk8" (OuterVolumeSpecName: "kube-api-access-pmpk8") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "kube-api-access-pmpk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.624993 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.653023 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-config-data" (OuterVolumeSpecName: "config-data") pod "fada73df-4c18-4f18-9fcd-9fe24825a32c" (UID: "fada73df-4c18-4f18-9fcd-9fe24825a32c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722684 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-iscsi\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722724 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722754 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" "
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722764 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-logs\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722775 4751 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmpk8\" (UniqueName: \"kubernetes.io/projected/fada73df-4c18-4f18-9fcd-9fe24825a32c-kube-api-access-pmpk8\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722786 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada73df-4c18-4f18-9fcd-9fe24825a32c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722799 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722808 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fada73df-4c18-4f18-9fcd-9fe24825a32c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722817 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.722825 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fada73df-4c18-4f18-9fcd-9fe24825a32c-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.734915 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.734998 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 31 15:06:09 crc 
kubenswrapper[4751]: I0131 15:06:09.824043 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.824097 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.896360 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="89a88ddaeae8a6fe7859be79e45bc66e157a0d02a03f5daf69e0ab6320bd15be" exitCode=0 Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.896442 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"89a88ddaeae8a6fe7859be79e45bc66e157a0d02a03f5daf69e0ab6320bd15be"} Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.897322 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130"} Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.897571 4751 scope.go:117] "RemoveContainer" containerID="7fcf941f127d31d0e5c99d5ef038c633782d289d0e911f4e9c5c6f77b2a91e2a" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.899773 4751 generic.go:334] "Generic (PLEG): container finished" podID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerID="0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536" exitCode=0 Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.899862 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.899885 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"fada73df-4c18-4f18-9fcd-9fe24825a32c","Type":"ContainerDied","Data":"0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536"} Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.899935 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"fada73df-4c18-4f18-9fcd-9fe24825a32c","Type":"ContainerDied","Data":"4e85dc9929fef5538bbedbabd1bc4862934d09f3a214bd5393c16cf9dbfd21f7"} Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.902466 4751 generic.go:334] "Generic (PLEG): container finished" podID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerID="5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848" exitCode=0 Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.902543 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.902588 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2ce40f98-80af-4a4b-8556-c5c7dd84fc58","Type":"ContainerDied","Data":"5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848"} Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.902621 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"2ce40f98-80af-4a4b-8556-c5c7dd84fc58","Type":"ContainerDied","Data":"c11c5ca70f63ee145b6adb16291fa14f1d7247eb5f019288ced5f1338933a04e"} Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.933470 4751 scope.go:117] "RemoveContainer" containerID="0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.944958 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.953035 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.960055 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.965846 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.971861 4751 scope.go:117] "RemoveContainer" containerID="4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.990847 4751 scope.go:117] "RemoveContainer" containerID="0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536" Jan 31 15:06:09 crc kubenswrapper[4751]: 
E0131 15:06:09.991333 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536\": container with ID starting with 0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536 not found: ID does not exist" containerID="0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.991369 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536"} err="failed to get container status \"0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536\": rpc error: code = NotFound desc = could not find container \"0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536\": container with ID starting with 0622c62b16c45cccd093df45438f4801ecb9dee67a405110fa2bd8152369a536 not found: ID does not exist" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.991397 4751 scope.go:117] "RemoveContainer" containerID="4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c" Jan 31 15:06:09 crc kubenswrapper[4751]: E0131 15:06:09.991776 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c\": container with ID starting with 4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c not found: ID does not exist" containerID="4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.991891 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c"} err="failed to get container status \"4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c\": 
rpc error: code = NotFound desc = could not find container \"4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c\": container with ID starting with 4ac3b73f999bb2d4398e520fc413f3d3e17b891e2a007e2b9a8454c50571906c not found: ID does not exist" Jan 31 15:06:09 crc kubenswrapper[4751]: I0131 15:06:09.991993 4751 scope.go:117] "RemoveContainer" containerID="5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.015671 4751 scope.go:117] "RemoveContainer" containerID="e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.040217 4751 scope.go:117] "RemoveContainer" containerID="5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848" Jan 31 15:06:10 crc kubenswrapper[4751]: E0131 15:06:10.040766 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848\": container with ID starting with 5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848 not found: ID does not exist" containerID="5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.040799 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848"} err="failed to get container status \"5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848\": rpc error: code = NotFound desc = could not find container \"5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848\": container with ID starting with 5b307a953bf028387f69f75e58e1545d2c2c38f81096b44abf4639237badb848 not found: ID does not exist" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.040827 4751 scope.go:117] "RemoveContainer" 
containerID="e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35" Jan 31 15:06:10 crc kubenswrapper[4751]: E0131 15:06:10.041237 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35\": container with ID starting with e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35 not found: ID does not exist" containerID="e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.041260 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35"} err="failed to get container status \"e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35\": rpc error: code = NotFound desc = could not find container \"e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35\": container with ID starting with e83e3eee6d1bd6eeb18c2480a68530a6737d3f76fac28ccff0e74d019e406b35 not found: ID does not exist" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.123449 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.229586 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-operator-scripts\") pod \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.229712 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x4fc\" (UniqueName: \"kubernetes.io/projected/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-kube-api-access-6x4fc\") pod \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\" (UID: \"09ca3bf6-027a-4e7b-a142-44ad4308fd3e\") " Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.230991 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09ca3bf6-027a-4e7b-a142-44ad4308fd3e" (UID: "09ca3bf6-027a-4e7b-a142-44ad4308fd3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.234313 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-kube-api-access-6x4fc" (OuterVolumeSpecName: "kube-api-access-6x4fc") pod "09ca3bf6-027a-4e7b-a142-44ad4308fd3e" (UID: "09ca3bf6-027a-4e7b-a142-44ad4308fd3e"). InnerVolumeSpecName "kube-api-access-6x4fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.331282 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.331321 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x4fc\" (UniqueName: \"kubernetes.io/projected/09ca3bf6-027a-4e7b-a142-44ad4308fd3e-kube-api-access-6x4fc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.418910 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" path="/var/lib/kubelet/pods/2ce40f98-80af-4a4b-8556-c5c7dd84fc58/volumes" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.419943 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" path="/var/lib/kubelet/pods/fada73df-4c18-4f18-9fcd-9fe24825a32c/volumes" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.858897 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.917482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg" event={"ID":"09ca3bf6-027a-4e7b-a142-44ad4308fd3e","Type":"ContainerDied","Data":"f49a7bb7647ca08427d7a0e06c9d600f223665df0c37dd55d0d845016d321065"} Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.917519 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f49a7bb7647ca08427d7a0e06c9d600f223665df0c37dd55d0d845016d321065" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.917646 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancefb75-account-delete-rcplg" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.924083 4751 generic.go:334] "Generic (PLEG): container finished" podID="90af064c-9d0a-4818-8e19-c87da44a879b" containerID="e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b" exitCode=0 Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.924136 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"90af064c-9d0a-4818-8e19-c87da44a879b","Type":"ContainerDied","Data":"e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b"} Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.924159 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"90af064c-9d0a-4818-8e19-c87da44a879b","Type":"ContainerDied","Data":"e99fc8a548b6e6e8e6da564fb55696f96c325bf5ca3500bbda2b1e9e31f7bf04"} Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.924176 4751 scope.go:117] "RemoveContainer" containerID="e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.924260 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.929454 4751 generic.go:334] "Generic (PLEG): container finished" podID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerID="90a6f0dd1552854347833452de54355b3cea39a33ffea2db5092440b501dadb7" exitCode=0 Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.929529 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0e8efba-9adc-482b-bd77-553d76648ac6","Type":"ContainerDied","Data":"90a6f0dd1552854347833452de54355b3cea39a33ffea2db5092440b501dadb7"} Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.950694 4751 scope.go:117] "RemoveContainer" containerID="8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.969957 4751 scope.go:117] "RemoveContainer" containerID="e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b" Jan 31 15:06:10 crc kubenswrapper[4751]: E0131 15:06:10.970575 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b\": container with ID starting with e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b not found: ID does not exist" containerID="e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.970605 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b"} err="failed to get container status \"e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b\": rpc error: code = NotFound desc = could not find container \"e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b\": container with ID starting with 
e0cc5d1490293994ec3b55b9f608e90b08473604cf0ebba39024c27ee8d6005b not found: ID does not exist" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.970630 4751 scope.go:117] "RemoveContainer" containerID="8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f" Jan 31 15:06:10 crc kubenswrapper[4751]: E0131 15:06:10.971034 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f\": container with ID starting with 8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f not found: ID does not exist" containerID="8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f" Jan 31 15:06:10 crc kubenswrapper[4751]: I0131 15:06:10.971054 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f"} err="failed to get container status \"8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f\": rpc error: code = NotFound desc = could not find container \"8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f\": container with ID starting with 8b807b8111d0f0f33aee37200b1958ff47b158280257c5ed2833cf9c5c3a286f not found: ID does not exist" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.043616 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.044485 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-nvme\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 
15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.044856 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.044933 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045039 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-sys\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045232 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-var-locks-brick\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045301 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-sys" (OuterVolumeSpecName: "sys") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045425 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-httpd-run\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045620 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-scripts\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045838 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-dev\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045937 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwj7s\" (UniqueName: \"kubernetes.io/projected/90af064c-9d0a-4818-8e19-c87da44a879b-kube-api-access-rwj7s\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046036 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-run\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046543 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-logs\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046678 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-lib-modules\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046909 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-config-data\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.047022 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-iscsi\") pod \"90af064c-9d0a-4818-8e19-c87da44a879b\" (UID: \"90af064c-9d0a-4818-8e19-c87da44a879b\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045433 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.045696 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046167 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-dev" (OuterVolumeSpecName: "dev") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046463 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-run" (OuterVolumeSpecName: "run") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046828 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-logs" (OuterVolumeSpecName: "logs") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.046860 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.047331 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.049869 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.049989 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050100 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050187 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050270 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050350 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-etc-nvme\") on node \"crc\" DevicePath \"\"" 
Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050426 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050902 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90af064c-9d0a-4818-8e19-c87da44a879b-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.050979 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90af064c-9d0a-4818-8e19-c87da44a879b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.105996 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90af064c-9d0a-4818-8e19-c87da44a879b-kube-api-access-rwj7s" (OuterVolumeSpecName: "kube-api-access-rwj7s") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "kube-api-access-rwj7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.106558 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-scripts" (OuterVolumeSpecName: "scripts") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.106614 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). 
InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.107188 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.143445 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.144217 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-config-data" (OuterVolumeSpecName: "config-data") pod "90af064c-9d0a-4818-8e19-c87da44a879b" (UID: "90af064c-9d0a-4818-8e19-c87da44a879b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.152206 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.152444 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.152510 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.152605 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwj7s\" (UniqueName: \"kubernetes.io/projected/90af064c-9d0a-4818-8e19-c87da44a879b-kube-api-access-rwj7s\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.152664 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90af064c-9d0a-4818-8e19-c87da44a879b-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.178059 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.178400 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.254820 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-iscsi\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.254900 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-httpd-run\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.254919 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.254945 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-run\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255027 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-sys\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255085 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-lib-modules\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255134 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-var-locks-brick\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255207 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255268 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-nvme\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255331 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-logs\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255364 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-scripts\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255419 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9kwf\" (UniqueName: \"kubernetes.io/projected/e0e8efba-9adc-482b-bd77-553d76648ac6-kube-api-access-l9kwf\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255452 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-dev\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255500 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-config-data\") pod \"e0e8efba-9adc-482b-bd77-553d76648ac6\" (UID: \"e0e8efba-9adc-482b-bd77-553d76648ac6\") " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255908 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.255928 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.263222 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.263540 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266624 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266892 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-logs" (OuterVolumeSpecName: "logs") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266920 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-sys" (OuterVolumeSpecName: "sys") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266938 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-run" (OuterVolumeSpecName: "run") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266954 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-dev" (OuterVolumeSpecName: "dev") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266971 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.266990 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.295290 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e8efba-9adc-482b-bd77-553d76648ac6-kube-api-access-l9kwf" (OuterVolumeSpecName: "kube-api-access-l9kwf") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "kube-api-access-l9kwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.295423 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-scripts" (OuterVolumeSpecName: "scripts") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.296669 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.296776 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "glance") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.307183 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.319159 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.321560 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-config-data" (OuterVolumeSpecName: "config-data") pod "e0e8efba-9adc-482b-bd77-553d76648ac6" (UID: "e0e8efba-9adc-482b-bd77-553d76648ac6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.356986 4751 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-dev\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357014 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357024 4751 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357032 4751 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357064 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357090 4751 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-run\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357098 4751 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-sys\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357106 4751 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357113 4751 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357126 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357134 4751 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e0e8efba-9adc-482b-bd77-553d76648ac6-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357142 4751 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e8efba-9adc-482b-bd77-553d76648ac6-logs\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357149 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e8efba-9adc-482b-bd77-553d76648ac6-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.357161 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9kwf\" (UniqueName: \"kubernetes.io/projected/e0e8efba-9adc-482b-bd77-553d76648ac6-kube-api-access-l9kwf\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.369723 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.370945 4751 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.458635 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.458668 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.943720 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e0e8efba-9adc-482b-bd77-553d76648ac6","Type":"ContainerDied","Data":"8b49f74873882c94e30e439215c7b1269126be109dcab9f528966ad2a1118a0c"} Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.943771 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.943801 4751 scope.go:117] "RemoveContainer" containerID="90a6f0dd1552854347833452de54355b3cea39a33ffea2db5092440b501dadb7" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.980924 4751 scope.go:117] "RemoveContainer" containerID="de4d5808fc4ba7308f826962b8650c6ce882dcfa84a2b9961ed782c3d596f76e" Jan 31 15:06:11 crc kubenswrapper[4751]: I0131 15:06:11.992061 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.000015 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.190743 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-bbbff"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.205653 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-bbbff"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.211568 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancefb75-account-delete-rcplg"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.219762 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancefb75-account-delete-rcplg"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.226049 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-fb75-account-create-update-8nfdw"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.232978 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-fb75-account-create-update-8nfdw"] Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.417065 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="09ca3bf6-027a-4e7b-a142-44ad4308fd3e" path="/var/lib/kubelet/pods/09ca3bf6-027a-4e7b-a142-44ad4308fd3e/volumes" Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.417811 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2dc104-ad94-47b2-add7-9314eb88e5b0" path="/var/lib/kubelet/pods/2d2dc104-ad94-47b2-add7-9314eb88e5b0/volumes" Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.418545 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88adbd16-7694-4f3b-8de1-b15932042491" path="/var/lib/kubelet/pods/88adbd16-7694-4f3b-8de1-b15932042491/volumes" Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.420157 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" path="/var/lib/kubelet/pods/90af064c-9d0a-4818-8e19-c87da44a879b/volumes" Jan 31 15:06:12 crc kubenswrapper[4751]: I0131 15:06:12.422386 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" path="/var/lib/kubelet/pods/e0e8efba-9adc-482b-bd77-553d76648ac6/volumes" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.708573 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ztzpn"] Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.710807 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.710986 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.711165 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ca3bf6-027a-4e7b-a142-44ad4308fd3e" containerName="mariadb-account-delete" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.711289 4751 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="09ca3bf6-027a-4e7b-a142-44ad4308fd3e" containerName="mariadb-account-delete" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.711436 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.711571 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.711708 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.711833 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.711965 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.712118 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.712260 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.712393 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.712524 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.712635 4751 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.712776 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.712899 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: E0131 15:06:14.713039 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.713227 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.713617 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.713770 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.713937 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ca3bf6-027a-4e7b-a142-44ad4308fd3e" containerName="mariadb-account-delete" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.714103 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce40f98-80af-4a4b-8556-c5c7dd84fc58" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.714248 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.714397 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="90af064c-9d0a-4818-8e19-c87da44a879b" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.714532 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.714658 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="fada73df-4c18-4f18-9fcd-9fe24825a32c" containerName="glance-log" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.714788 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e8efba-9adc-482b-bd77-553d76648ac6" containerName="glance-httpd" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.716641 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.737124 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztzpn"] Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.821554 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqfs\" (UniqueName: \"kubernetes.io/projected/d783dd01-73a7-4362-888a-ab84bc8739df-kube-api-access-sgqfs\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.821707 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-catalog-content\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.821784 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-utilities\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.923713 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqfs\" (UniqueName: \"kubernetes.io/projected/d783dd01-73a7-4362-888a-ab84bc8739df-kube-api-access-sgqfs\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.923784 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-catalog-content\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.923831 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-utilities\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.924313 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-catalog-content\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.924401 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-utilities\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:14 crc kubenswrapper[4751]: I0131 15:06:14.941775 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqfs\" (UniqueName: \"kubernetes.io/projected/d783dd01-73a7-4362-888a-ab84bc8739df-kube-api-access-sgqfs\") pod \"redhat-operators-ztzpn\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:15 crc kubenswrapper[4751]: I0131 15:06:15.051107 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:15 crc kubenswrapper[4751]: I0131 15:06:15.506911 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztzpn"] Jan 31 15:06:15 crc kubenswrapper[4751]: I0131 15:06:15.984407 4751 generic.go:334] "Generic (PLEG): container finished" podID="d783dd01-73a7-4362-888a-ab84bc8739df" containerID="568906c2cc7feff3ba674be852dca9f1ba04b313f69bf113705a16e3309aa4da" exitCode=0 Jan 31 15:06:15 crc kubenswrapper[4751]: I0131 15:06:15.984454 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerDied","Data":"568906c2cc7feff3ba674be852dca9f1ba04b313f69bf113705a16e3309aa4da"} Jan 31 15:06:15 crc kubenswrapper[4751]: I0131 15:06:15.984482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerStarted","Data":"f8e2ea7f77972f236bec476d7b7bb124f32cd9d091fcbabec970fc3dd4a6de6c"} Jan 31 15:06:15 crc kubenswrapper[4751]: I0131 15:06:15.987058 4751 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Jan 31 15:06:16 crc kubenswrapper[4751]: I0131 15:06:16.992947 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerStarted","Data":"c9f9f3a04268cfbac7a889faf5708fdd7ab535489380c76f269ae48567d562f0"} Jan 31 15:06:18 crc kubenswrapper[4751]: I0131 15:06:18.002372 4751 generic.go:334] "Generic (PLEG): container finished" podID="d783dd01-73a7-4362-888a-ab84bc8739df" containerID="c9f9f3a04268cfbac7a889faf5708fdd7ab535489380c76f269ae48567d562f0" exitCode=0 Jan 31 15:06:18 crc kubenswrapper[4751]: I0131 15:06:18.002425 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerDied","Data":"c9f9f3a04268cfbac7a889faf5708fdd7ab535489380c76f269ae48567d562f0"} Jan 31 15:06:19 crc kubenswrapper[4751]: I0131 15:06:19.032286 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerStarted","Data":"8a8f6ec3fc4799718a2c776fd8b2c60694522c37afe696834e35482b1037e761"} Jan 31 15:06:19 crc kubenswrapper[4751]: I0131 15:06:19.060056 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ztzpn" podStartSLOduration=2.6316717240000003 podStartE2EDuration="5.060034629s" podCreationTimestamp="2026-01-31 15:06:14 +0000 UTC" firstStartedPulling="2026-01-31 15:06:15.986825606 +0000 UTC m=+1480.361538491" lastFinishedPulling="2026-01-31 15:06:18.415188461 +0000 UTC m=+1482.789901396" observedRunningTime="2026-01-31 15:06:19.056544877 +0000 UTC m=+1483.431257772" watchObservedRunningTime="2026-01-31 15:06:19.060034629 +0000 UTC m=+1483.434747524" Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.289592 4751 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-z72xp"] Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.297812 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-z72xp"] Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.317693 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318148 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-server" containerID="cri-o://d93f0c8cc4f4e310c9d207351f924f281c14e44b511b3d4a8f51fed27dbeed8f" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318471 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="swift-recon-cron" containerID="cri-o://ae11b6c0a7f7893c0ba728593c9e1b6db0bc399ae9c55df1f1023d422fc9333c" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318515 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="rsync" containerID="cri-o://519bd8155f30918b172e24832e84310378bd7ea10e796377a992dd3fe9e7276d" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318547 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-expirer" containerID="cri-o://950232b5b660c70b9100e81003ff993443f745f40d7da6ba8dc037822059cb8e" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318576 4751 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-updater" containerID="cri-o://71ca1416bdc095b268ec385a4ebcd269b729c80c3aee7f832db2892f4fe6e78a" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318605 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-auditor" containerID="cri-o://1e2003fe4d2366b583ebedf393e2492c910be0ebf3f2652f5a15b1e8c78961df" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318635 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-replicator" containerID="cri-o://03b25054db738f38056ec8af2822c9203e252f1a4f95be8c4ab8c1c34de3455c" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318664 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-server" containerID="cri-o://1f74cf8c2ce97cd17f509447e4c986197d8af0e8b2f40e7c6a07653c81e66d3b" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318692 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-updater" containerID="cri-o://400722d3dac6cd5b0b727b3e599b127bb527981160049f2561a32e7ada14affd" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318721 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-auditor" 
containerID="cri-o://34a87b0cfca857f6a2c07d4713531103b7df75f0fdc3e2be299ecaf554d5d9db" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318747 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-replicator" containerID="cri-o://03c86cbbc819872662746f2a8384c7c50f07b481c42b5f3d39e0b1e87c7b0557" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318776 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-server" containerID="cri-o://3b4375e902d16ea731761694aa85354dcfcda568f68f1d4210b06b07c701f380" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318804 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-reaper" containerID="cri-o://a4e14596c5c3a7af2ea9e82736c916fc73b8fcbf27a523b8fe47f9a8e69b1bc2" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318832 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-auditor" containerID="cri-o://461a1aaa8bc72705195647c97b28e111484e900c69e9a4da07e510a6c451ed4c" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.318858 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-replicator" containerID="cri-o://d0ab6cd06ea2abbd171a5345dc579495df175d9d8a52b30a0139e24e65e43616" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.360042 4751 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-58vrl"] Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.360299 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-httpd" containerID="cri-o://7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.360427 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-server" containerID="cri-o://67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be" gracePeriod=30 Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.415173 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606aa4a9-2afe-4f51-a562-90f716040b58" path="/var/lib/kubelet/pods/606aa4a9-2afe-4f51-a562-90f716040b58/volumes" Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.706331 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-server" probeResult="failure" output="Get \"http://10.217.0.91:8080/healthcheck\": dial tcp 10.217.0.91:8080: connect: connection refused" Jan 31 15:06:20 crc kubenswrapper[4751]: I0131 15:06:20.706565 4751 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.91:8080/healthcheck\": dial tcp 10.217.0.91:8080: connect: connection refused" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.003301 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054457 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="519bd8155f30918b172e24832e84310378bd7ea10e796377a992dd3fe9e7276d" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054488 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="950232b5b660c70b9100e81003ff993443f745f40d7da6ba8dc037822059cb8e" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054497 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="71ca1416bdc095b268ec385a4ebcd269b729c80c3aee7f832db2892f4fe6e78a" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054503 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="1e2003fe4d2366b583ebedf393e2492c910be0ebf3f2652f5a15b1e8c78961df" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054509 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="03b25054db738f38056ec8af2822c9203e252f1a4f95be8c4ab8c1c34de3455c" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054516 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="1f74cf8c2ce97cd17f509447e4c986197d8af0e8b2f40e7c6a07653c81e66d3b" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054523 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="400722d3dac6cd5b0b727b3e599b127bb527981160049f2561a32e7ada14affd" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054530 4751 generic.go:334] "Generic (PLEG): container finished" 
podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="34a87b0cfca857f6a2c07d4713531103b7df75f0fdc3e2be299ecaf554d5d9db" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054536 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="03c86cbbc819872662746f2a8384c7c50f07b481c42b5f3d39e0b1e87c7b0557" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054542 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="3b4375e902d16ea731761694aa85354dcfcda568f68f1d4210b06b07c701f380" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054549 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="a4e14596c5c3a7af2ea9e82736c916fc73b8fcbf27a523b8fe47f9a8e69b1bc2" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054555 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="461a1aaa8bc72705195647c97b28e111484e900c69e9a4da07e510a6c451ed4c" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054560 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="d0ab6cd06ea2abbd171a5345dc579495df175d9d8a52b30a0139e24e65e43616" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054567 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="d93f0c8cc4f4e310c9d207351f924f281c14e44b511b3d4a8f51fed27dbeed8f" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054605 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"519bd8155f30918b172e24832e84310378bd7ea10e796377a992dd3fe9e7276d"} Jan 31 15:06:21 crc 
kubenswrapper[4751]: I0131 15:06:21.054630 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"950232b5b660c70b9100e81003ff993443f745f40d7da6ba8dc037822059cb8e"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054640 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"71ca1416bdc095b268ec385a4ebcd269b729c80c3aee7f832db2892f4fe6e78a"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054650 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"1e2003fe4d2366b583ebedf393e2492c910be0ebf3f2652f5a15b1e8c78961df"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054659 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"03b25054db738f38056ec8af2822c9203e252f1a4f95be8c4ab8c1c34de3455c"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054669 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"1f74cf8c2ce97cd17f509447e4c986197d8af0e8b2f40e7c6a07653c81e66d3b"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054677 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"400722d3dac6cd5b0b727b3e599b127bb527981160049f2561a32e7ada14affd"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054685 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"34a87b0cfca857f6a2c07d4713531103b7df75f0fdc3e2be299ecaf554d5d9db"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054694 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"03c86cbbc819872662746f2a8384c7c50f07b481c42b5f3d39e0b1e87c7b0557"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054702 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"3b4375e902d16ea731761694aa85354dcfcda568f68f1d4210b06b07c701f380"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054712 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"a4e14596c5c3a7af2ea9e82736c916fc73b8fcbf27a523b8fe47f9a8e69b1bc2"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054723 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"461a1aaa8bc72705195647c97b28e111484e900c69e9a4da07e510a6c451ed4c"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054735 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"d0ab6cd06ea2abbd171a5345dc579495df175d9d8a52b30a0139e24e65e43616"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.054745 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"d93f0c8cc4f4e310c9d207351f924f281c14e44b511b3d4a8f51fed27dbeed8f"} 
Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057043 4751 generic.go:334] "Generic (PLEG): container finished" podID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerID="67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057063 4751 generic.go:334] "Generic (PLEG): container finished" podID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerID="7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321" exitCode=0 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057134 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" event={"ID":"26ee66f9-5607-4559-9a64-6767dfbcc078","Type":"ContainerDied","Data":"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" event={"ID":"26ee66f9-5607-4559-9a64-6767dfbcc078","Type":"ContainerDied","Data":"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057231 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" event={"ID":"26ee66f9-5607-4559-9a64-6767dfbcc078","Type":"ContainerDied","Data":"85b271a6f57bacb15bd471b08b0e25366c5d1865f74c103bc014e71042620a53"} Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057282 4751 scope.go:117] "RemoveContainer" containerID="67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.057156 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-58vrl" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.078579 4751 scope.go:117] "RemoveContainer" containerID="7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.097375 4751 scope.go:117] "RemoveContainer" containerID="67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be" Jan 31 15:06:21 crc kubenswrapper[4751]: E0131 15:06:21.097868 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be\": container with ID starting with 67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be not found: ID does not exist" containerID="67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.097915 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be"} err="failed to get container status \"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be\": rpc error: code = NotFound desc = could not find container \"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be\": container with ID starting with 67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be not found: ID does not exist" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.097945 4751 scope.go:117] "RemoveContainer" containerID="7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321" Jan 31 15:06:21 crc kubenswrapper[4751]: E0131 15:06:21.098377 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321\": container with ID starting with 
7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321 not found: ID does not exist" containerID="7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.098402 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321"} err="failed to get container status \"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321\": rpc error: code = NotFound desc = could not find container \"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321\": container with ID starting with 7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321 not found: ID does not exist" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.098420 4751 scope.go:117] "RemoveContainer" containerID="67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.098782 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be"} err="failed to get container status \"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be\": rpc error: code = NotFound desc = could not find container \"67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be\": container with ID starting with 67b86444b56a0b82ee27cdb476ad3cf81bbe2a2988cb8d86234cd5cb875fb2be not found: ID does not exist" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.098803 4751 scope.go:117] "RemoveContainer" containerID="7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.099231 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321"} err="failed to get container status 
\"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321\": rpc error: code = NotFound desc = could not find container \"7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321\": container with ID starting with 7994c2196fb62df3ba578a245a33690acd7fb1518638072c0dcea5a66bf4d321 not found: ID does not exist" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.121985 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-run-httpd\") pod \"26ee66f9-5607-4559-9a64-6767dfbcc078\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.122028 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ee66f9-5607-4559-9a64-6767dfbcc078-config-data\") pod \"26ee66f9-5607-4559-9a64-6767dfbcc078\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.122092 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-log-httpd\") pod \"26ee66f9-5607-4559-9a64-6767dfbcc078\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.122122 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") pod \"26ee66f9-5607-4559-9a64-6767dfbcc078\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.122245 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xgx9\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-kube-api-access-6xgx9\") pod 
\"26ee66f9-5607-4559-9a64-6767dfbcc078\" (UID: \"26ee66f9-5607-4559-9a64-6767dfbcc078\") " Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.122522 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "26ee66f9-5607-4559-9a64-6767dfbcc078" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.122658 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "26ee66f9-5607-4559-9a64-6767dfbcc078" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.126831 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-kube-api-access-6xgx9" (OuterVolumeSpecName: "kube-api-access-6xgx9") pod "26ee66f9-5607-4559-9a64-6767dfbcc078" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078"). InnerVolumeSpecName "kube-api-access-6xgx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.127006 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "26ee66f9-5607-4559-9a64-6767dfbcc078" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.154350 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ee66f9-5607-4559-9a64-6767dfbcc078-config-data" (OuterVolumeSpecName: "config-data") pod "26ee66f9-5607-4559-9a64-6767dfbcc078" (UID: "26ee66f9-5607-4559-9a64-6767dfbcc078"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.224391 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xgx9\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-kube-api-access-6xgx9\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.224421 4751 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.224431 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ee66f9-5607-4559-9a64-6767dfbcc078-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.224438 4751 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26ee66f9-5607-4559-9a64-6767dfbcc078-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.224446 4751 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/26ee66f9-5607-4559-9a64-6767dfbcc078-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.388849 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-58vrl"] Jan 31 15:06:21 crc 
kubenswrapper[4751]: I0131 15:06:21.396000 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-58vrl"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.451942 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-56bwv"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.457216 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-56bwv"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.470853 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-hnxnd"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.494699 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-hnxnd"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.508628 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-859d455469-zqqzw"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.508893 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" podUID="dabb55da-08db-4d2a-8b2d-ac7b2b657053" containerName="keystone-api" containerID="cri-o://6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb" gracePeriod=30 Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.529730 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-cron-29497861-5bd6d"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.533348 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-cron-29497861-5bd6d"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.539501 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystonedfde-account-delete-rzcsj"] Jan 31 15:06:21 crc kubenswrapper[4751]: E0131 15:06:21.539800 4751 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-httpd" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.539816 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-httpd" Jan 31 15:06:21 crc kubenswrapper[4751]: E0131 15:06:21.539844 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-server" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.539851 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-server" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.539988 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-server" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.540011 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" containerName="proxy-httpd" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.540530 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.545360 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystonedfde-account-delete-rzcsj"] Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.653993 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949efaf4-a5db-405d-9d40-c44d525c603c-operator-scripts\") pod \"keystonedfde-account-delete-rzcsj\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.654135 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk8gf\" (UniqueName: \"kubernetes.io/projected/949efaf4-a5db-405d-9d40-c44d525c603c-kube-api-access-lk8gf\") pod \"keystonedfde-account-delete-rzcsj\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.755124 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949efaf4-a5db-405d-9d40-c44d525c603c-operator-scripts\") pod \"keystonedfde-account-delete-rzcsj\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.755725 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk8gf\" (UniqueName: \"kubernetes.io/projected/949efaf4-a5db-405d-9d40-c44d525c603c-kube-api-access-lk8gf\") pod \"keystonedfde-account-delete-rzcsj\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc 
kubenswrapper[4751]: I0131 15:06:21.755889 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949efaf4-a5db-405d-9d40-c44d525c603c-operator-scripts\") pod \"keystonedfde-account-delete-rzcsj\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.777851 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk8gf\" (UniqueName: \"kubernetes.io/projected/949efaf4-a5db-405d-9d40-c44d525c603c-kube-api-access-lk8gf\") pod \"keystonedfde-account-delete-rzcsj\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:21 crc kubenswrapper[4751]: I0131 15:06:21.856154 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:22 crc kubenswrapper[4751]: I0131 15:06:22.073757 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystonedfde-account-delete-rzcsj"] Jan 31 15:06:22 crc kubenswrapper[4751]: W0131 15:06:22.087875 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod949efaf4_a5db_405d_9d40_c44d525c603c.slice/crio-5c96c33c1fd64acca4de260b260d0cd9dfb53370d0a6e41639488a98be0757ce WatchSource:0}: Error finding container 5c96c33c1fd64acca4de260b260d0cd9dfb53370d0a6e41639488a98be0757ce: Status 404 returned error can't find the container with id 5c96c33c1fd64acca4de260b260d0cd9dfb53370d0a6e41639488a98be0757ce Jan 31 15:06:22 crc kubenswrapper[4751]: I0131 15:06:22.418354 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="041ede36-25a1-4d6d-9de2-d16218c5fc67" path="/var/lib/kubelet/pods/041ede36-25a1-4d6d-9de2-d16218c5fc67/volumes" Jan 31 15:06:22 crc 
kubenswrapper[4751]: I0131 15:06:22.419358 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ee66f9-5607-4559-9a64-6767dfbcc078" path="/var/lib/kubelet/pods/26ee66f9-5607-4559-9a64-6767dfbcc078/volumes" Jan 31 15:06:22 crc kubenswrapper[4751]: I0131 15:06:22.420039 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce6ceb9-5b0d-4ec7-9492-94dce9bb261d" path="/var/lib/kubelet/pods/bce6ceb9-5b0d-4ec7-9492-94dce9bb261d/volumes" Jan 31 15:06:22 crc kubenswrapper[4751]: I0131 15:06:22.421232 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5e8bad-e481-445e-99e8-5a5487e908d8" path="/var/lib/kubelet/pods/ff5e8bad-e481-445e-99e8-5a5487e908d8/volumes" Jan 31 15:06:23 crc kubenswrapper[4751]: I0131 15:06:23.079617 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" event={"ID":"949efaf4-a5db-405d-9d40-c44d525c603c","Type":"ContainerStarted","Data":"5c96c33c1fd64acca4de260b260d0cd9dfb53370d0a6e41639488a98be0757ce"} Jan 31 15:06:24 crc kubenswrapper[4751]: I0131 15:06:24.088761 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" event={"ID":"949efaf4-a5db-405d-9d40-c44d525c603c","Type":"ContainerStarted","Data":"b5cb3ee4032129b568b4ee0fa56e2f13d4d48986ad6a3c19ca00fa7b56b0e716"} Jan 31 15:06:24 crc kubenswrapper[4751]: I0131 15:06:24.108005 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" podStartSLOduration=3.1079861 podStartE2EDuration="3.1079861s" podCreationTimestamp="2026-01-31 15:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:06:24.102029461 +0000 UTC m=+1488.476742346" watchObservedRunningTime="2026-01-31 15:06:24.1079861 +0000 UTC m=+1488.482698985" Jan 31 15:06:25 crc 
kubenswrapper[4751]: I0131 15:06:25.052233 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.052624 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.124040 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.178643 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.360168 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztzpn"] Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.552729 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-6tvsv"] Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.553837 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.571693 4751 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.624618 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-6tvsv"] Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.640625 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.645924 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.652081 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.662645 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-6tvsv"] Jan 31 15:06:25 crc kubenswrapper[4751]: E0131 15:06:25.663448 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-sgblm operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="glance-kuttl-tests/root-account-create-update-6tvsv" podUID="f6b12715-fb69-4237-ac73-a59a6972d988" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.730018 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts\") pod \"root-account-create-update-6tvsv\" (UID: \"f6b12715-fb69-4237-ac73-a59a6972d988\") " pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.730206 4751 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgblm\" (UniqueName: \"kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm\") pod \"root-account-create-update-6tvsv\" (UID: \"f6b12715-fb69-4237-ac73-a59a6972d988\") " pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.767793 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-2" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="galera" containerID="cri-o://eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b" gracePeriod=30 Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.831955 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts\") pod \"root-account-create-update-6tvsv\" (UID: \"f6b12715-fb69-4237-ac73-a59a6972d988\") " pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:25 crc kubenswrapper[4751]: I0131 15:06:25.832137 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgblm\" (UniqueName: \"kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm\") pod \"root-account-create-update-6tvsv\" (UID: \"f6b12715-fb69-4237-ac73-a59a6972d988\") " pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:25 crc kubenswrapper[4751]: E0131 15:06:25.832148 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 15:06:25 crc kubenswrapper[4751]: E0131 15:06:25.832232 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts podName:f6b12715-fb69-4237-ac73-a59a6972d988 nodeName:}" failed. 
No retries permitted until 2026-01-31 15:06:26.332205098 +0000 UTC m=+1490.706917983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts") pod "root-account-create-update-6tvsv" (UID: "f6b12715-fb69-4237-ac73-a59a6972d988") : configmap "openstack-scripts" not found Jan 31 15:06:25 crc kubenswrapper[4751]: E0131 15:06:25.840159 4751 projected.go:194] Error preparing data for projected volume kube-api-access-sgblm for pod glance-kuttl-tests/root-account-create-update-6tvsv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 15:06:25 crc kubenswrapper[4751]: E0131 15:06:25.840255 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm podName:f6b12715-fb69-4237-ac73-a59a6972d988 nodeName:}" failed. No retries permitted until 2026-01-31 15:06:26.340232202 +0000 UTC m=+1490.714945087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sgblm" (UniqueName: "kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm") pod "root-account-create-update-6tvsv" (UID: "f6b12715-fb69-4237-ac73-a59a6972d988") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.082025 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.107581 4751 generic.go:334] "Generic (PLEG): container finished" podID="dabb55da-08db-4d2a-8b2d-ac7b2b657053" containerID="6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb" exitCode=0 Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.107690 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.108196 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" event={"ID":"dabb55da-08db-4d2a-8b2d-ac7b2b657053","Type":"ContainerDied","Data":"6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb"} Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.108261 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-859d455469-zqqzw" event={"ID":"dabb55da-08db-4d2a-8b2d-ac7b2b657053","Type":"ContainerDied","Data":"02e1eb0fcf9c093b28dd6fc9f0fb02613d1865a02336d6e8e82c2fa50f8597a7"} Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.108286 4751 scope.go:117] "RemoveContainer" containerID="6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.110405 4751 generic.go:334] "Generic (PLEG): container finished" podID="949efaf4-a5db-405d-9d40-c44d525c603c" containerID="b5cb3ee4032129b568b4ee0fa56e2f13d4d48986ad6a3c19ca00fa7b56b0e716" exitCode=0 Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.110446 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" event={"ID":"949efaf4-a5db-405d-9d40-c44d525c603c","Type":"ContainerDied","Data":"b5cb3ee4032129b568b4ee0fa56e2f13d4d48986ad6a3c19ca00fa7b56b0e716"} Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.110531 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.130105 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.134263 4751 scope.go:117] "RemoveContainer" containerID="6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb" Jan 31 15:06:26 crc kubenswrapper[4751]: E0131 15:06:26.134654 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb\": container with ID starting with 6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb not found: ID does not exist" containerID="6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.134703 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb"} err="failed to get container status \"6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb\": rpc error: code = NotFound desc = could not find container \"6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb\": container with ID starting with 6d283eaf7e9a4eadb7f123ebbd0723c09363494de09f8fb76c6271216f1a8ecb not found: ID does not exist" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.141093 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-config-data\") pod \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.141159 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-fernet-keys\") pod \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\" (UID: 
\"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.141188 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-scripts\") pod \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.141242 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qfq\" (UniqueName: \"kubernetes.io/projected/dabb55da-08db-4d2a-8b2d-ac7b2b657053-kube-api-access-46qfq\") pod \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.141323 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-credential-keys\") pod \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\" (UID: \"dabb55da-08db-4d2a-8b2d-ac7b2b657053\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.147185 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dabb55da-08db-4d2a-8b2d-ac7b2b657053" (UID: "dabb55da-08db-4d2a-8b2d-ac7b2b657053"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.147271 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dabb55da-08db-4d2a-8b2d-ac7b2b657053" (UID: "dabb55da-08db-4d2a-8b2d-ac7b2b657053"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.147452 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-scripts" (OuterVolumeSpecName: "scripts") pod "dabb55da-08db-4d2a-8b2d-ac7b2b657053" (UID: "dabb55da-08db-4d2a-8b2d-ac7b2b657053"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.153180 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabb55da-08db-4d2a-8b2d-ac7b2b657053-kube-api-access-46qfq" (OuterVolumeSpecName: "kube-api-access-46qfq") pod "dabb55da-08db-4d2a-8b2d-ac7b2b657053" (UID: "dabb55da-08db-4d2a-8b2d-ac7b2b657053"). InnerVolumeSpecName "kube-api-access-46qfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.165893 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-config-data" (OuterVolumeSpecName: "config-data") pod "dabb55da-08db-4d2a-8b2d-ac7b2b657053" (UID: "dabb55da-08db-4d2a-8b2d-ac7b2b657053"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.211461 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.211868 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/memcached-0" podUID="9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" containerName="memcached" containerID="cri-o://7d9c0759f36bb098c88e33085270280041e2db4b3aa27d3f10dea45195deff2f" gracePeriod=30 Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.243660 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.243697 4751 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.243706 4751 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.243715 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qfq\" (UniqueName: \"kubernetes.io/projected/dabb55da-08db-4d2a-8b2d-ac7b2b657053-kube-api-access-46qfq\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.243726 4751 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dabb55da-08db-4d2a-8b2d-ac7b2b657053-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.345497 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sgblm\" (UniqueName: \"kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm\") pod \"root-account-create-update-6tvsv\" (UID: \"f6b12715-fb69-4237-ac73-a59a6972d988\") " pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.345915 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts\") pod \"root-account-create-update-6tvsv\" (UID: \"f6b12715-fb69-4237-ac73-a59a6972d988\") " pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:26 crc kubenswrapper[4751]: E0131 15:06:26.346108 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 15:06:26 crc kubenswrapper[4751]: E0131 15:06:26.346185 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts podName:f6b12715-fb69-4237-ac73-a59a6972d988 nodeName:}" failed. No retries permitted until 2026-01-31 15:06:27.346167458 +0000 UTC m=+1491.720880343 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts") pod "root-account-create-update-6tvsv" (UID: "f6b12715-fb69-4237-ac73-a59a6972d988") : configmap "openstack-scripts" not found Jan 31 15:06:26 crc kubenswrapper[4751]: E0131 15:06:26.348440 4751 projected.go:194] Error preparing data for projected volume kube-api-access-sgblm for pod glance-kuttl-tests/root-account-create-update-6tvsv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 15:06:26 crc kubenswrapper[4751]: E0131 15:06:26.348560 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm podName:f6b12715-fb69-4237-ac73-a59a6972d988 nodeName:}" failed. No retries permitted until 2026-01-31 15:06:27.348533021 +0000 UTC m=+1491.723245926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgblm" (UniqueName: "kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm") pod "root-account-create-update-6tvsv" (UID: "f6b12715-fb69-4237-ac73-a59a6972d988") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.463378 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-859d455469-zqqzw"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.469313 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-859d455469-zqqzw"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.537120 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-pl5bs"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.542453 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-pl5bs"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 
15:06:26.588972 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystonedfde-account-delete-rzcsj"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.593553 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.598266 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-dfde-account-create-update-vbcnd"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.633723 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.730396 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.853870 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-generated\") pod \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854230 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854242 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3fcd9bac-c0cb-4de4-b630-0db07f110da7" (UID: "3fcd9bac-c0cb-4de4-b630-0db07f110da7"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854272 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kolla-config\") pod \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854387 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-default\") pod \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854434 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-operator-scripts\") pod \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854484 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwsz6\" (UniqueName: \"kubernetes.io/projected/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kube-api-access-rwsz6\") pod \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\" (UID: \"3fcd9bac-c0cb-4de4-b630-0db07f110da7\") " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.854855 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.855286 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3fcd9bac-c0cb-4de4-b630-0db07f110da7" (UID: "3fcd9bac-c0cb-4de4-b630-0db07f110da7"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.855334 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3fcd9bac-c0cb-4de4-b630-0db07f110da7" (UID: "3fcd9bac-c0cb-4de4-b630-0db07f110da7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.855346 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3fcd9bac-c0cb-4de4-b630-0db07f110da7" (UID: "3fcd9bac-c0cb-4de4-b630-0db07f110da7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.858303 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kube-api-access-rwsz6" (OuterVolumeSpecName: "kube-api-access-rwsz6") pod "3fcd9bac-c0cb-4de4-b630-0db07f110da7" (UID: "3fcd9bac-c0cb-4de4-b630-0db07f110da7"). InnerVolumeSpecName "kube-api-access-rwsz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.862478 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "3fcd9bac-c0cb-4de4-b630-0db07f110da7" (UID: "3fcd9bac-c0cb-4de4-b630-0db07f110da7"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.956619 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.956677 4751 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.956700 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.956718 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fcd9bac-c0cb-4de4-b630-0db07f110da7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.956734 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwsz6\" (UniqueName: \"kubernetes.io/projected/3fcd9bac-c0cb-4de4-b630-0db07f110da7-kube-api-access-rwsz6\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:26 crc kubenswrapper[4751]: I0131 15:06:26.972626 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.051451 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.058636 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.124828 4751 generic.go:334] "Generic (PLEG): container finished" podID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerID="eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b" exitCode=0 Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.124887 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.124909 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"3fcd9bac-c0cb-4de4-b630-0db07f110da7","Type":"ContainerDied","Data":"eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b"} Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.124940 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"3fcd9bac-c0cb-4de4-b630-0db07f110da7","Type":"ContainerDied","Data":"4483e874a8f4e15e4dfcdca687206a7af35257a8c5ba1cb56d33195e769924f9"} Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.124955 4751 scope.go:117] "RemoveContainer" containerID="eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.125117 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-6tvsv" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.125915 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ztzpn" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="registry-server" containerID="cri-o://8a8f6ec3fc4799718a2c776fd8b2c60694522c37afe696834e35482b1037e761" gracePeriod=2 Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.152910 4751 scope.go:117] "RemoveContainer" containerID="b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.171956 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/rabbitmq-server-0" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerName="rabbitmq" containerID="cri-o://07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9" gracePeriod=604800 Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.182504 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-6tvsv"] Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.196248 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-6tvsv"] Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.217243 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.227368 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.232742 4751 scope.go:117] "RemoveContainer" containerID="eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b" Jan 31 15:06:27 crc kubenswrapper[4751]: E0131 15:06:27.233065 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b\": container with ID starting with eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b not found: ID does not exist" containerID="eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.233122 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b"} err="failed to get container status \"eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b\": rpc error: code = NotFound desc = could not find container \"eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b\": container with ID starting with eeb0727f6d7a3d1d251766b50edc1058bc460aa581ba0d5f746de288b9b3f16b not found: ID does not exist" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.233149 4751 scope.go:117] "RemoveContainer" containerID="b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d" Jan 31 15:06:27 crc kubenswrapper[4751]: E0131 15:06:27.233559 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d\": container with ID starting with b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d not found: ID does not exist" containerID="b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.233598 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d"} err="failed to get container status \"b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d\": rpc error: code = NotFound desc = could not find container 
\"b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d\": container with ID starting with b0d3ea91f474d5f0241c4f1e0b20927cdf5d85e229fe91747902d0e90daf242d not found: ID does not exist" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.363362 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b12715-fb69-4237-ac73-a59a6972d988-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.363401 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgblm\" (UniqueName: \"kubernetes.io/projected/f6b12715-fb69-4237-ac73-a59a6972d988-kube-api-access-sgblm\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.445399 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.566821 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk8gf\" (UniqueName: \"kubernetes.io/projected/949efaf4-a5db-405d-9d40-c44d525c603c-kube-api-access-lk8gf\") pod \"949efaf4-a5db-405d-9d40-c44d525c603c\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.566979 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949efaf4-a5db-405d-9d40-c44d525c603c-operator-scripts\") pod \"949efaf4-a5db-405d-9d40-c44d525c603c\" (UID: \"949efaf4-a5db-405d-9d40-c44d525c603c\") " Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.567549 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949efaf4-a5db-405d-9d40-c44d525c603c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "949efaf4-a5db-405d-9d40-c44d525c603c" (UID: 
"949efaf4-a5db-405d-9d40-c44d525c603c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.576243 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949efaf4-a5db-405d-9d40-c44d525c603c-kube-api-access-lk8gf" (OuterVolumeSpecName: "kube-api-access-lk8gf") pod "949efaf4-a5db-405d-9d40-c44d525c603c" (UID: "949efaf4-a5db-405d-9d40-c44d525c603c"). InnerVolumeSpecName "kube-api-access-lk8gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.669226 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949efaf4-a5db-405d-9d40-c44d525c603c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.669273 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk8gf\" (UniqueName: \"kubernetes.io/projected/949efaf4-a5db-405d-9d40-c44d525c603c-kube-api-access-lk8gf\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.841614 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"] Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.841851 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" podUID="f70443db-a342-4f5d-81b2-39c01f494cf8" containerName="manager" containerID="cri-o://ab946ef56298d90f2da08c7aa03dc9761afb66c0a527a34685eef2375ecebd56" gracePeriod=10 Jan 31 15:06:27 crc kubenswrapper[4751]: I0131 15:06:27.856417 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-1" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerName="galera" 
containerID="cri-o://6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571" gracePeriod=28 Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.112610 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-bvvpv"] Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.113166 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-bvvpv" podUID="eacc0c6c-95c4-487f-945e-4a1e3e17c508" containerName="registry-server" containerID="cri-o://432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091" gracePeriod=30 Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.147865 4751 generic.go:334] "Generic (PLEG): container finished" podID="d783dd01-73a7-4362-888a-ab84bc8739df" containerID="8a8f6ec3fc4799718a2c776fd8b2c60694522c37afe696834e35482b1037e761" exitCode=0 Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.147936 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerDied","Data":"8a8f6ec3fc4799718a2c776fd8b2c60694522c37afe696834e35482b1037e761"} Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.152661 4751 generic.go:334] "Generic (PLEG): container finished" podID="f70443db-a342-4f5d-81b2-39c01f494cf8" containerID="ab946ef56298d90f2da08c7aa03dc9761afb66c0a527a34685eef2375ecebd56" exitCode=0 Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.152751 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" event={"ID":"f70443db-a342-4f5d-81b2-39c01f494cf8","Type":"ContainerDied","Data":"ab946ef56298d90f2da08c7aa03dc9761afb66c0a527a34685eef2375ecebd56"} Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.159954 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp"] Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.160946 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.161115 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystonedfde-account-delete-rzcsj" event={"ID":"949efaf4-a5db-405d-9d40-c44d525c603c","Type":"ContainerDied","Data":"5c96c33c1fd64acca4de260b260d0cd9dfb53370d0a6e41639488a98be0757ce"} Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.161153 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c96c33c1fd64acca4de260b260d0cd9dfb53370d0a6e41639488a98be0757ce" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.162737 4751 generic.go:334] "Generic (PLEG): container finished" podID="9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" containerID="7d9c0759f36bb098c88e33085270280041e2db4b3aa27d3f10dea45195deff2f" exitCode=0 Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.162772 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c","Type":"ContainerDied","Data":"7d9c0759f36bb098c88e33085270280041e2db4b3aa27d3f10dea45195deff2f"} Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.171233 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9a978acee48f3b64f3e45376f243bba270f67c38c120bc2488cd2a78caspwrp"] Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.293660 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystonedfde-account-delete-rzcsj"] Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.302919 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystonedfde-account-delete-rzcsj"] Jan 31 15:06:28 crc 
kubenswrapper[4751]: I0131 15:06:28.428857 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a47516-5cf6-431b-86ee-7732bd88fed4" path="/var/lib/kubelet/pods/06a47516-5cf6-431b-86ee-7732bd88fed4/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.429637 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.429649 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" path="/var/lib/kubelet/pods/3fcd9bac-c0cb-4de4-b630-0db07f110da7/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.430233 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="568d26c9-1fe8-4e01-a7c0-cbe91951fe60" path="/var/lib/kubelet/pods/568d26c9-1fe8-4e01-a7c0-cbe91951fe60/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.431351 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585f0c4b-3594-4683-bb38-d1fcbbee12cd" path="/var/lib/kubelet/pods/585f0c4b-3594-4683-bb38-d1fcbbee12cd/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.431937 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949efaf4-a5db-405d-9d40-c44d525c603c" path="/var/lib/kubelet/pods/949efaf4-a5db-405d-9d40-c44d525c603c/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.432402 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabb55da-08db-4d2a-8b2d-ac7b2b657053" path="/var/lib/kubelet/pods/dabb55da-08db-4d2a-8b2d-ac7b2b657053/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.433204 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b12715-fb69-4237-ac73-a59a6972d988" path="/var/lib/kubelet/pods/f6b12715-fb69-4237-ac73-a59a6972d988/volumes" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.578952 4751 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kolla-config\") pod \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.579062 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbtbr\" (UniqueName: \"kubernetes.io/projected/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kube-api-access-gbtbr\") pod \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.579200 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-config-data\") pod \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\" (UID: \"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.580907 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" (UID: "9dfaa3fc-8bf7-420f-8581-4e917bf3f41c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.583221 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-config-data" (OuterVolumeSpecName: "config-data") pod "9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" (UID: "9dfaa3fc-8bf7-420f-8581-4e917bf3f41c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.598330 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kube-api-access-gbtbr" (OuterVolumeSpecName: "kube-api-access-gbtbr") pod "9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" (UID: "9dfaa3fc-8bf7-420f-8581-4e917bf3f41c"). InnerVolumeSpecName "kube-api-access-gbtbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.681051 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbtbr\" (UniqueName: \"kubernetes.io/projected/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kube-api-access-gbtbr\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.681101 4751 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.681111 4751 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.725269 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.739487 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.806524 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883407 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-catalog-content\") pod \"d783dd01-73a7-4362-888a-ab84bc8739df\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883462 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-apiservice-cert\") pod \"f70443db-a342-4f5d-81b2-39c01f494cf8\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883543 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-webhook-cert\") pod \"f70443db-a342-4f5d-81b2-39c01f494cf8\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883568 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgqfs\" (UniqueName: \"kubernetes.io/projected/d783dd01-73a7-4362-888a-ab84bc8739df-kube-api-access-sgqfs\") pod \"d783dd01-73a7-4362-888a-ab84bc8739df\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883614 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6f7g\" (UniqueName: \"kubernetes.io/projected/f70443db-a342-4f5d-81b2-39c01f494cf8-kube-api-access-l6f7g\") pod \"f70443db-a342-4f5d-81b2-39c01f494cf8\" (UID: \"f70443db-a342-4f5d-81b2-39c01f494cf8\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883638 4751 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-utilities\") pod \"d783dd01-73a7-4362-888a-ab84bc8739df\" (UID: \"d783dd01-73a7-4362-888a-ab84bc8739df\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.883659 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtvgv\" (UniqueName: \"kubernetes.io/projected/eacc0c6c-95c4-487f-945e-4a1e3e17c508-kube-api-access-vtvgv\") pod \"eacc0c6c-95c4-487f-945e-4a1e3e17c508\" (UID: \"eacc0c6c-95c4-487f-945e-4a1e3e17c508\") " Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.884564 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-utilities" (OuterVolumeSpecName: "utilities") pod "d783dd01-73a7-4362-888a-ab84bc8739df" (UID: "d783dd01-73a7-4362-888a-ab84bc8739df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.888063 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70443db-a342-4f5d-81b2-39c01f494cf8-kube-api-access-l6f7g" (OuterVolumeSpecName: "kube-api-access-l6f7g") pod "f70443db-a342-4f5d-81b2-39c01f494cf8" (UID: "f70443db-a342-4f5d-81b2-39c01f494cf8"). InnerVolumeSpecName "kube-api-access-l6f7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.888313 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d783dd01-73a7-4362-888a-ab84bc8739df-kube-api-access-sgqfs" (OuterVolumeSpecName: "kube-api-access-sgqfs") pod "d783dd01-73a7-4362-888a-ab84bc8739df" (UID: "d783dd01-73a7-4362-888a-ab84bc8739df"). InnerVolumeSpecName "kube-api-access-sgqfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.888820 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "f70443db-a342-4f5d-81b2-39c01f494cf8" (UID: "f70443db-a342-4f5d-81b2-39c01f494cf8"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.889895 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eacc0c6c-95c4-487f-945e-4a1e3e17c508-kube-api-access-vtvgv" (OuterVolumeSpecName: "kube-api-access-vtvgv") pod "eacc0c6c-95c4-487f-945e-4a1e3e17c508" (UID: "eacc0c6c-95c4-487f-945e-4a1e3e17c508"). InnerVolumeSpecName "kube-api-access-vtvgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.890622 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "f70443db-a342-4f5d-81b2-39c01f494cf8" (UID: "f70443db-a342-4f5d-81b2-39c01f494cf8"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.985321 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtvgv\" (UniqueName: \"kubernetes.io/projected/eacc0c6c-95c4-487f-945e-4a1e3e17c508-kube-api-access-vtvgv\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.985361 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.985374 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f70443db-a342-4f5d-81b2-39c01f494cf8-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.985387 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgqfs\" (UniqueName: \"kubernetes.io/projected/d783dd01-73a7-4362-888a-ab84bc8739df-kube-api-access-sgqfs\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.985399 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6f7g\" (UniqueName: \"kubernetes.io/projected/f70443db-a342-4f5d-81b2-39c01f494cf8-kube-api-access-l6f7g\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:28 crc kubenswrapper[4751]: I0131 15:06:28.985411 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.010861 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d783dd01-73a7-4362-888a-ab84bc8739df" (UID: 
"d783dd01-73a7-4362-888a-ab84bc8739df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.086556 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d783dd01-73a7-4362-888a-ab84bc8739df-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.172549 4751 generic.go:334] "Generic (PLEG): container finished" podID="eacc0c6c-95c4-487f-945e-4a1e3e17c508" containerID="432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091" exitCode=0 Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.172602 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-bvvpv" event={"ID":"eacc0c6c-95c4-487f-945e-4a1e3e17c508","Type":"ContainerDied","Data":"432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091"} Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.172625 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-bvvpv" event={"ID":"eacc0c6c-95c4-487f-945e-4a1e3e17c508","Type":"ContainerDied","Data":"701664b77023940ba4b0968a1f7dc87bd2c93fe4b8f5f2f39b4e39a24e4b2f4b"} Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.172644 4751 scope.go:117] "RemoveContainer" containerID="432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.172714 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-bvvpv" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.183590 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.183665 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"9dfaa3fc-8bf7-420f-8581-4e917bf3f41c","Type":"ContainerDied","Data":"cf904354b92714c266cf175421ba71e5ed9cb49d7ba4bbc0c72df9a09635ce8a"} Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.194047 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztzpn" event={"ID":"d783dd01-73a7-4362-888a-ab84bc8739df","Type":"ContainerDied","Data":"f8e2ea7f77972f236bec476d7b7bb124f32cd9d091fcbabec970fc3dd4a6de6c"} Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.194164 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztzpn" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.208516 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" event={"ID":"f70443db-a342-4f5d-81b2-39c01f494cf8","Type":"ContainerDied","Data":"1eff2ed52d31a6cb86d6cac75fe9fb2899624e91687b3dbe55c93d71e4cef517"} Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.208596 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.212328 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-bvvpv"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.215475 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-bvvpv"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.244527 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztzpn"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.250150 4751 scope.go:117] "RemoveContainer" containerID="432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091" Jan 31 15:06:29 crc kubenswrapper[4751]: E0131 15:06:29.250652 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091\": container with ID starting with 432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091 not found: ID does not exist" containerID="432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.250683 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091"} err="failed to get container status \"432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091\": rpc error: code = NotFound desc = could not find container \"432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091\": container with ID starting with 432c19d38a03c6f3813c47fccb7f600a3290b65e75e89319b377dead29257091 not found: ID does not exist" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.250704 4751 scope.go:117] "RemoveContainer" 
containerID="7d9c0759f36bb098c88e33085270280041e2db4b3aa27d3f10dea45195deff2f" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.254103 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ztzpn"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.272650 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.286193 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.290865 4751 scope.go:117] "RemoveContainer" containerID="8a8f6ec3fc4799718a2c776fd8b2c60694522c37afe696834e35482b1037e761" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.293841 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.300703 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-controller-manager-75dc47fc9-v4thz"] Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.340685 4751 scope.go:117] "RemoveContainer" containerID="c9f9f3a04268cfbac7a889faf5708fdd7ab535489380c76f269ae48567d562f0" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.360211 4751 scope.go:117] "RemoveContainer" containerID="568906c2cc7feff3ba674be852dca9f1ba04b313f69bf113705a16e3309aa4da" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.382018 4751 scope.go:117] "RemoveContainer" containerID="ab946ef56298d90f2da08c7aa03dc9761afb66c0a527a34685eef2375ecebd56" Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.875925 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-0" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerName="galera" 
containerID="cri-o://0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea" gracePeriod=26 Jan 31 15:06:29 crc kubenswrapper[4751]: I0131 15:06:29.995521 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.099586 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-generated\") pod \"22459bcc-672e-4390-89ae-2b5fa48ded71\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.099642 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-default\") pod \"22459bcc-672e-4390-89ae-2b5fa48ded71\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.099789 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"22459bcc-672e-4390-89ae-2b5fa48ded71\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.099917 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-kolla-config\") pod \"22459bcc-672e-4390-89ae-2b5fa48ded71\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.099986 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-operator-scripts\") pod \"22459bcc-672e-4390-89ae-2b5fa48ded71\" 
(UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100044 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nj4k\" (UniqueName: \"kubernetes.io/projected/22459bcc-672e-4390-89ae-2b5fa48ded71-kube-api-access-5nj4k\") pod \"22459bcc-672e-4390-89ae-2b5fa48ded71\" (UID: \"22459bcc-672e-4390-89ae-2b5fa48ded71\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100263 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "22459bcc-672e-4390-89ae-2b5fa48ded71" (UID: "22459bcc-672e-4390-89ae-2b5fa48ded71"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100480 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "22459bcc-672e-4390-89ae-2b5fa48ded71" (UID: "22459bcc-672e-4390-89ae-2b5fa48ded71"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100564 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "22459bcc-672e-4390-89ae-2b5fa48ded71" (UID: "22459bcc-672e-4390-89ae-2b5fa48ded71"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100674 4751 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100690 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.100724 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.101234 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22459bcc-672e-4390-89ae-2b5fa48ded71" (UID: "22459bcc-672e-4390-89ae-2b5fa48ded71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.107268 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22459bcc-672e-4390-89ae-2b5fa48ded71-kube-api-access-5nj4k" (OuterVolumeSpecName: "kube-api-access-5nj4k") pod "22459bcc-672e-4390-89ae-2b5fa48ded71" (UID: "22459bcc-672e-4390-89ae-2b5fa48ded71"). InnerVolumeSpecName "kube-api-access-5nj4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.111766 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "22459bcc-672e-4390-89ae-2b5fa48ded71" (UID: "22459bcc-672e-4390-89ae-2b5fa48ded71"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.170992 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.202444 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.203452 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22459bcc-672e-4390-89ae-2b5fa48ded71-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.203544 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nj4k\" (UniqueName: \"kubernetes.io/projected/22459bcc-672e-4390-89ae-2b5fa48ded71-kube-api-access-5nj4k\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.215173 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.231500 4751 generic.go:334] "Generic (PLEG): container finished" podID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerID="6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571" exitCode=0 Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 
15:06:30.231569 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"22459bcc-672e-4390-89ae-2b5fa48ded71","Type":"ContainerDied","Data":"6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571"} Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.231596 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"22459bcc-672e-4390-89ae-2b5fa48ded71","Type":"ContainerDied","Data":"6b6faf7aa73840af2027f08065efac105f4b0ad43c2d2c60890bf024de99e2ca"} Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.231615 4751 scope.go:117] "RemoveContainer" containerID="6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.231703 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.239377 4751 generic.go:334] "Generic (PLEG): container finished" podID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerID="07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9" exitCode=0 Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.239445 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"19317a08-b18b-42c9-bdc9-394e1e06257d","Type":"ContainerDied","Data":"07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9"} Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.239470 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"19317a08-b18b-42c9-bdc9-394e1e06257d","Type":"ContainerDied","Data":"f6c134f960dca8717c0eb288c9e0a54cef2dc5968f5f68b04744d850b9ec573e"} Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.239534 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.270107 4751 scope.go:117] "RemoveContainer" containerID="0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.296184 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.302157 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306433 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19317a08-b18b-42c9-bdc9-394e1e06257d-plugins-conf\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306473 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-erlang-cookie\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306529 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-plugins\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306567 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-confd\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: 
\"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306604 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19317a08-b18b-42c9-bdc9-394e1e06257d-erlang-cookie-secret\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306742 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306787 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19317a08-b18b-42c9-bdc9-394e1e06257d-pod-info\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.306806 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbjl\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-kube-api-access-zrbjl\") pod \"19317a08-b18b-42c9-bdc9-394e1e06257d\" (UID: \"19317a08-b18b-42c9-bdc9-394e1e06257d\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.307027 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.307089 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19317a08-b18b-42c9-bdc9-394e1e06257d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod 
"19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.307131 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.307977 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.310416 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-kube-api-access-zrbjl" (OuterVolumeSpecName: "kube-api-access-zrbjl") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "kube-api-access-zrbjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.315252 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/19317a08-b18b-42c9-bdc9-394e1e06257d-pod-info" (OuterVolumeSpecName: "pod-info") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.318743 4751 scope.go:117] "RemoveContainer" containerID="6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.319827 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-59595cd-9djr5"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.320018 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" podUID="9f3dfaad-d451-448b-a447-47fc7bbff0e5" containerName="manager" containerID="cri-o://668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755" gracePeriod=10 Jan 31 15:06:30 crc kubenswrapper[4751]: E0131 15:06:30.321625 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571\": container with ID starting with 6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571 not found: ID does not exist" containerID="6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.321678 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571"} err="failed to get container status \"6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571\": rpc error: code = NotFound desc = could not find container \"6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571\": container with ID starting with 6234bbbcfac3eddf715e4285a6b3d7b6a0aff6d850ad2df858a2deee34d9f571 not found: ID does not exist" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.321702 4751 scope.go:117] "RemoveContainer" 
containerID="0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987" Jan 31 15:06:30 crc kubenswrapper[4751]: E0131 15:06:30.323584 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987\": container with ID starting with 0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987 not found: ID does not exist" containerID="0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.323615 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987"} err="failed to get container status \"0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987\": rpc error: code = NotFound desc = could not find container \"0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987\": container with ID starting with 0844a74085d7d943d717fb7babb3a7b7db796dff92dfd7c3894a2eccb22eb987 not found: ID does not exist" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.323635 4751 scope.go:117] "RemoveContainer" containerID="07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.324799 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19317a08-b18b-42c9-bdc9-394e1e06257d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.332526 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443" (OuterVolumeSpecName: "persistence") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "pvc-f145c232-830a-4841-bd1f-7c42e25cd443". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.349028 4751 scope.go:117] "RemoveContainer" containerID="505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.379324 4751 scope.go:117] "RemoveContainer" containerID="07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9" Jan 31 15:06:30 crc kubenswrapper[4751]: E0131 15:06:30.379613 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9\": container with ID starting with 07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9 not found: ID does not exist" containerID="07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.379638 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9"} err="failed to get container status \"07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9\": rpc error: code = NotFound desc = could not find container \"07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9\": container with ID starting with 07f687eb09cbc17ef2ede020cb3e1c35352131bf2222486a5b70524349e266f9 not found: ID does not exist" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 
15:06:30.379656 4751 scope.go:117] "RemoveContainer" containerID="505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586" Jan 31 15:06:30 crc kubenswrapper[4751]: E0131 15:06:30.380107 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586\": container with ID starting with 505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586 not found: ID does not exist" containerID="505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.380143 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586"} err="failed to get container status \"505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586\": rpc error: code = NotFound desc = could not find container \"505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586\": container with ID starting with 505748b1e10e777b66b173a6705d54ff333de5b60e6cd125a1cf81bd7167e586 not found: ID does not exist" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.395534 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "19317a08-b18b-42c9-bdc9-394e1e06257d" (UID: "19317a08-b18b-42c9-bdc9-394e1e06257d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.407911 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.407955 4751 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19317a08-b18b-42c9-bdc9-394e1e06257d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.407997 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f145c232-830a-4841-bd1f-7c42e25cd443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") on node \"crc\" " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.408015 4751 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19317a08-b18b-42c9-bdc9-394e1e06257d-pod-info\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.408028 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrbjl\" (UniqueName: \"kubernetes.io/projected/19317a08-b18b-42c9-bdc9-394e1e06257d-kube-api-access-zrbjl\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.408039 4751 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19317a08-b18b-42c9-bdc9-394e1e06257d-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.408050 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-erlang-cookie\") on node \"crc\" 
DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.408061 4751 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19317a08-b18b-42c9-bdc9-394e1e06257d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.423345 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" path="/var/lib/kubelet/pods/22459bcc-672e-4390-89ae-2b5fa48ded71/volumes" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.425092 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" path="/var/lib/kubelet/pods/9dfaa3fc-8bf7-420f-8581-4e917bf3f41c/volumes" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.426146 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" path="/var/lib/kubelet/pods/d783dd01-73a7-4362-888a-ab84bc8739df/volumes" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.427464 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eacc0c6c-95c4-487f-945e-4a1e3e17c508" path="/var/lib/kubelet/pods/eacc0c6c-95c4-487f-945e-4a1e3e17c508/volumes" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.428004 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f70443db-a342-4f5d-81b2-39c01f494cf8" path="/var/lib/kubelet/pods/f70443db-a342-4f5d-81b2-39c01f494cf8/volumes" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.429704 4751 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.429855 4751 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f145c232-830a-4841-bd1f-7c42e25cd443" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443") on node "crc" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.510545 4751 reconciler_common.go:293] "Volume detached for volume \"pvc-f145c232-830a-4841-bd1f-7c42e25cd443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f145c232-830a-4841-bd1f-7c42e25cd443\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.578208 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-75pvx"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.578491 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-75pvx" podUID="065b8624-7cdb-463c-9636-d3e980119eb7" containerName="registry-server" containerID="cri-o://d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3" gracePeriod=30 Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.587142 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.595460 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.607973 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.612565 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b4bbztq"] Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.721899 4751 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.813407 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-webhook-cert\") pod \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.813494 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67l2j\" (UniqueName: \"kubernetes.io/projected/9f3dfaad-d451-448b-a447-47fc7bbff0e5-kube-api-access-67l2j\") pod \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.813563 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-apiservice-cert\") pod \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\" (UID: \"9f3dfaad-d451-448b-a447-47fc7bbff0e5\") " Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.816575 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "9f3dfaad-d451-448b-a447-47fc7bbff0e5" (UID: "9f3dfaad-d451-448b-a447-47fc7bbff0e5"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.816699 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "9f3dfaad-d451-448b-a447-47fc7bbff0e5" (UID: "9f3dfaad-d451-448b-a447-47fc7bbff0e5"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.816941 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3dfaad-d451-448b-a447-47fc7bbff0e5-kube-api-access-67l2j" (OuterVolumeSpecName: "kube-api-access-67l2j") pod "9f3dfaad-d451-448b-a447-47fc7bbff0e5" (UID: "9f3dfaad-d451-448b-a447-47fc7bbff0e5"). InnerVolumeSpecName "kube-api-access-67l2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.856202 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.915613 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.915648 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f3dfaad-d451-448b-a447-47fc7bbff0e5-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.915656 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67l2j\" (UniqueName: \"kubernetes.io/projected/9f3dfaad-d451-448b-a447-47fc7bbff0e5-kube-api-access-67l2j\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:30 crc kubenswrapper[4751]: I0131 15:06:30.976602 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.016981 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-operator-scripts\") pod \"07a2906d-db30-4578-8b1e-088ca2f20ced\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.017096 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng8cd\" (UniqueName: \"kubernetes.io/projected/07a2906d-db30-4578-8b1e-088ca2f20ced-kube-api-access-ng8cd\") pod \"07a2906d-db30-4578-8b1e-088ca2f20ced\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.017138 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-default\") pod \"07a2906d-db30-4578-8b1e-088ca2f20ced\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.017177 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-kolla-config\") pod \"07a2906d-db30-4578-8b1e-088ca2f20ced\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.017242 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"07a2906d-db30-4578-8b1e-088ca2f20ced\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.017279 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-generated\") pod \"07a2906d-db30-4578-8b1e-088ca2f20ced\" (UID: \"07a2906d-db30-4578-8b1e-088ca2f20ced\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.018516 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "07a2906d-db30-4578-8b1e-088ca2f20ced" (UID: "07a2906d-db30-4578-8b1e-088ca2f20ced"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.018575 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "07a2906d-db30-4578-8b1e-088ca2f20ced" (UID: "07a2906d-db30-4578-8b1e-088ca2f20ced"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.019673 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07a2906d-db30-4578-8b1e-088ca2f20ced" (UID: "07a2906d-db30-4578-8b1e-088ca2f20ced"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.020310 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "07a2906d-db30-4578-8b1e-088ca2f20ced" (UID: "07a2906d-db30-4578-8b1e-088ca2f20ced"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.024338 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a2906d-db30-4578-8b1e-088ca2f20ced-kube-api-access-ng8cd" (OuterVolumeSpecName: "kube-api-access-ng8cd") pod "07a2906d-db30-4578-8b1e-088ca2f20ced" (UID: "07a2906d-db30-4578-8b1e-088ca2f20ced"). InnerVolumeSpecName "kube-api-access-ng8cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.026913 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "07a2906d-db30-4578-8b1e-088ca2f20ced" (UID: "07a2906d-db30-4578-8b1e-088ca2f20ced"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119221 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpg2k\" (UniqueName: \"kubernetes.io/projected/065b8624-7cdb-463c-9636-d3e980119eb7-kube-api-access-qpg2k\") pod \"065b8624-7cdb-463c-9636-d3e980119eb7\" (UID: \"065b8624-7cdb-463c-9636-d3e980119eb7\") " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119758 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng8cd\" (UniqueName: \"kubernetes.io/projected/07a2906d-db30-4578-8b1e-088ca2f20ced-kube-api-access-ng8cd\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119784 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119820 4751 reconciler_common.go:293] "Volume detached for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119846 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119861 4751 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07a2906d-db30-4578-8b1e-088ca2f20ced-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.119897 4751 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a2906d-db30-4578-8b1e-088ca2f20ced-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.129903 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065b8624-7cdb-463c-9636-d3e980119eb7-kube-api-access-qpg2k" (OuterVolumeSpecName: "kube-api-access-qpg2k") pod "065b8624-7cdb-463c-9636-d3e980119eb7" (UID: "065b8624-7cdb-463c-9636-d3e980119eb7"). InnerVolumeSpecName "kube-api-access-qpg2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.137353 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.221023 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.221059 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpg2k\" (UniqueName: \"kubernetes.io/projected/065b8624-7cdb-463c-9636-d3e980119eb7-kube-api-access-qpg2k\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.256569 4751 generic.go:334] "Generic (PLEG): container finished" podID="9f3dfaad-d451-448b-a447-47fc7bbff0e5" containerID="668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755" exitCode=0 Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.256648 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" event={"ID":"9f3dfaad-d451-448b-a447-47fc7bbff0e5","Type":"ContainerDied","Data":"668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755"} Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.256678 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" event={"ID":"9f3dfaad-d451-448b-a447-47fc7bbff0e5","Type":"ContainerDied","Data":"da3b689c07e135768fb2bc22c72ffa9872cf722e04a986707e86515f65114b9c"} Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.256700 4751 scope.go:117] "RemoveContainer" containerID="668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.256798 4751 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-59595cd-9djr5" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.260564 4751 generic.go:334] "Generic (PLEG): container finished" podID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerID="0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea" exitCode=0 Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.260691 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"07a2906d-db30-4578-8b1e-088ca2f20ced","Type":"ContainerDied","Data":"0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea"} Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.260766 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"07a2906d-db30-4578-8b1e-088ca2f20ced","Type":"ContainerDied","Data":"2161c6d33cfda8a5b256a8346412b18ad489372437142a6a6602a50128a7c01a"} Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.260878 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.267099 4751 generic.go:334] "Generic (PLEG): container finished" podID="065b8624-7cdb-463c-9636-d3e980119eb7" containerID="d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3" exitCode=0 Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.267133 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-75pvx" event={"ID":"065b8624-7cdb-463c-9636-d3e980119eb7","Type":"ContainerDied","Data":"d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3"} Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.267151 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-75pvx" event={"ID":"065b8624-7cdb-463c-9636-d3e980119eb7","Type":"ContainerDied","Data":"5f493f4e9467a7936b5f9e1ffc78338268f76a1484833cfafa1962d6944fc1c3"} Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.267153 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-75pvx" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.278169 4751 scope.go:117] "RemoveContainer" containerID="668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755" Jan 31 15:06:31 crc kubenswrapper[4751]: E0131 15:06:31.278606 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755\": container with ID starting with 668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755 not found: ID does not exist" containerID="668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.278637 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755"} err="failed to get container status \"668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755\": rpc error: code = NotFound desc = could not find container \"668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755\": container with ID starting with 668d137892e68f6f4b2298a804a817a3b16a09cc4c85201f8a03fca82e38e755 not found: ID does not exist" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.278657 4751 scope.go:117] "RemoveContainer" containerID="0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.296508 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.304118 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.309350 4751 scope.go:117] "RemoveContainer" containerID="1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96" 
Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.321657 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-75pvx"] Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.329027 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-75pvx"] Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.334344 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-59595cd-9djr5"] Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.338928 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-controller-manager-59595cd-9djr5"] Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.345124 4751 scope.go:117] "RemoveContainer" containerID="0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea" Jan 31 15:06:31 crc kubenswrapper[4751]: E0131 15:06:31.345564 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea\": container with ID starting with 0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea not found: ID does not exist" containerID="0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.345606 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea"} err="failed to get container status \"0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea\": rpc error: code = NotFound desc = could not find container \"0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea\": container with ID starting with 0e2fc16f141e03061cef807a2713d8f66a6c5d9ed59205690727526ba6a882ea not found: ID does not exist" Jan 31 15:06:31 crc 
kubenswrapper[4751]: I0131 15:06:31.345636 4751 scope.go:117] "RemoveContainer" containerID="1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96" Jan 31 15:06:31 crc kubenswrapper[4751]: E0131 15:06:31.346123 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96\": container with ID starting with 1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96 not found: ID does not exist" containerID="1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.346233 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96"} err="failed to get container status \"1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96\": rpc error: code = NotFound desc = could not find container \"1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96\": container with ID starting with 1dd59d047e2f99760bb45d01f43a08d4aeb1e5d45326b19b0123bcf023e41f96 not found: ID does not exist" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.346310 4751 scope.go:117] "RemoveContainer" containerID="d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.365228 4751 scope.go:117] "RemoveContainer" containerID="d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3" Jan 31 15:06:31 crc kubenswrapper[4751]: E0131 15:06:31.365811 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3\": container with ID starting with d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3 not found: ID does not exist" 
containerID="d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3" Jan 31 15:06:31 crc kubenswrapper[4751]: I0131 15:06:31.365913 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3"} err="failed to get container status \"d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3\": rpc error: code = NotFound desc = could not find container \"d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3\": container with ID starting with d60e188fde30ce119895fe465702862991673f5195ee276a966d98efdbbb7cf3 not found: ID does not exist" Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.416624 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065b8624-7cdb-463c-9636-d3e980119eb7" path="/var/lib/kubelet/pods/065b8624-7cdb-463c-9636-d3e980119eb7/volumes" Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.417506 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" path="/var/lib/kubelet/pods/07a2906d-db30-4578-8b1e-088ca2f20ced/volumes" Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.418504 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" path="/var/lib/kubelet/pods/19317a08-b18b-42c9-bdc9-394e1e06257d/volumes" Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.419964 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886303a3-d05b-4551-bd03-ebc2e2aef77c" path="/var/lib/kubelet/pods/886303a3-d05b-4551-bd03-ebc2e2aef77c/volumes" Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.420812 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3dfaad-d451-448b-a447-47fc7bbff0e5" path="/var/lib/kubelet/pods/9f3dfaad-d451-448b-a447-47fc7bbff0e5/volumes" Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.842545 4751 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"] Jan 31 15:06:32 crc kubenswrapper[4751]: I0131 15:06:32.842772 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" podUID="49ea8aae-ad89-4383-8f2f-ba35872fd605" containerName="manager" containerID="cri-o://0c37d2b2bcc47557f4028d9e251b0db237f4b56ff1d49ca666627d0449655ab2" gracePeriod=10 Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.154755 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-6bwnv"] Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.155238 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-6bwnv" podUID="08530f42-16c5-4253-a623-2a032aeb95a7" containerName="registry-server" containerID="cri-o://3035639c5750cf779b9b57b5d0ade23abfc3c28de57f8e43e074a91f02a62e68" gracePeriod=30 Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.182904 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr"] Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.188636 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4ef6sfsr"] Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.299922 4751 generic.go:334] "Generic (PLEG): container finished" podID="08530f42-16c5-4253-a623-2a032aeb95a7" containerID="3035639c5750cf779b9b57b5d0ade23abfc3c28de57f8e43e074a91f02a62e68" exitCode=0 Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.299977 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6bwnv" 
event={"ID":"08530f42-16c5-4253-a623-2a032aeb95a7","Type":"ContainerDied","Data":"3035639c5750cf779b9b57b5d0ade23abfc3c28de57f8e43e074a91f02a62e68"} Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.301920 4751 generic.go:334] "Generic (PLEG): container finished" podID="49ea8aae-ad89-4383-8f2f-ba35872fd605" containerID="0c37d2b2bcc47557f4028d9e251b0db237f4b56ff1d49ca666627d0449655ab2" exitCode=0 Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.301952 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" event={"ID":"49ea8aae-ad89-4383-8f2f-ba35872fd605","Type":"ContainerDied","Data":"0c37d2b2bcc47557f4028d9e251b0db237f4b56ff1d49ca666627d0449655ab2"} Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.757141 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-6bwnv" Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.844523 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.857177 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msbp2\" (UniqueName: \"kubernetes.io/projected/08530f42-16c5-4253-a623-2a032aeb95a7-kube-api-access-msbp2\") pod \"08530f42-16c5-4253-a623-2a032aeb95a7\" (UID: \"08530f42-16c5-4253-a623-2a032aeb95a7\") " Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.862626 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08530f42-16c5-4253-a623-2a032aeb95a7-kube-api-access-msbp2" (OuterVolumeSpecName: "kube-api-access-msbp2") pod "08530f42-16c5-4253-a623-2a032aeb95a7" (UID: "08530f42-16c5-4253-a623-2a032aeb95a7"). InnerVolumeSpecName "kube-api-access-msbp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.961530 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-apiservice-cert\") pod \"49ea8aae-ad89-4383-8f2f-ba35872fd605\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.961683 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdb6\" (UniqueName: \"kubernetes.io/projected/49ea8aae-ad89-4383-8f2f-ba35872fd605-kube-api-access-mvdb6\") pod \"49ea8aae-ad89-4383-8f2f-ba35872fd605\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.961718 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-webhook-cert\") pod \"49ea8aae-ad89-4383-8f2f-ba35872fd605\" (UID: \"49ea8aae-ad89-4383-8f2f-ba35872fd605\") " Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.962563 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msbp2\" (UniqueName: \"kubernetes.io/projected/08530f42-16c5-4253-a623-2a032aeb95a7-kube-api-access-msbp2\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.965217 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "49ea8aae-ad89-4383-8f2f-ba35872fd605" (UID: "49ea8aae-ad89-4383-8f2f-ba35872fd605"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.967161 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "49ea8aae-ad89-4383-8f2f-ba35872fd605" (UID: "49ea8aae-ad89-4383-8f2f-ba35872fd605"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:33 crc kubenswrapper[4751]: I0131 15:06:33.968405 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ea8aae-ad89-4383-8f2f-ba35872fd605-kube-api-access-mvdb6" (OuterVolumeSpecName: "kube-api-access-mvdb6") pod "49ea8aae-ad89-4383-8f2f-ba35872fd605" (UID: "49ea8aae-ad89-4383-8f2f-ba35872fd605"). InnerVolumeSpecName "kube-api-access-mvdb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.063722 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.063759 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdb6\" (UniqueName: \"kubernetes.io/projected/49ea8aae-ad89-4383-8f2f-ba35872fd605-kube-api-access-mvdb6\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.063772 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49ea8aae-ad89-4383-8f2f-ba35872fd605-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.310146 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-6bwnv" event={"ID":"08530f42-16c5-4253-a623-2a032aeb95a7","Type":"ContainerDied","Data":"287dcdc51fb3cdd7484c91633318b58c60ad6d8b753d031c65761b77a7b8670b"}
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.310178 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-6bwnv"
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.310198 4751 scope.go:117] "RemoveContainer" containerID="3035639c5750cf779b9b57b5d0ade23abfc3c28de57f8e43e074a91f02a62e68"
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.312999 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq" event={"ID":"49ea8aae-ad89-4383-8f2f-ba35872fd605","Type":"ContainerDied","Data":"57b02fb0aa9fe148da3716e9376b1f52be8527b141c4d304bf8040ea0b69e451"}
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.313056 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.330932 4751 scope.go:117] "RemoveContainer" containerID="0c37d2b2bcc47557f4028d9e251b0db237f4b56ff1d49ca666627d0449655ab2"
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.350606 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"]
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.358609 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f68887647-qvqrq"]
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.366268 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-6bwnv"]
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.369756 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-6bwnv"]
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.413133 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08530f42-16c5-4253-a623-2a032aeb95a7" path="/var/lib/kubelet/pods/08530f42-16c5-4253-a623-2a032aeb95a7/volumes"
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.413741 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ea8aae-ad89-4383-8f2f-ba35872fd605" path="/var/lib/kubelet/pods/49ea8aae-ad89-4383-8f2f-ba35872fd605/volumes"
Jan 31 15:06:34 crc kubenswrapper[4751]: I0131 15:06:34.414428 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772cd794-fe9a-4ac3-8df8-e7f29edb85bf" path="/var/lib/kubelet/pods/772cd794-fe9a-4ac3-8df8-e7f29edb85bf/volumes"
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.225323 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"]
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.225943 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" podUID="3b77f113-f8c0-47b8-ad79-d1be38bf6e09" containerName="operator" containerID="cri-o://669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f" gracePeriod=10
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.552510 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-2wvgm"]
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.552740 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" podUID="44c515c1-f30f-44da-8959-cfd2530b46b7" containerName="registry-server" containerID="cri-o://584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15" gracePeriod=30
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.585576 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb"]
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.599728 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590swhdb"]
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.699409 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.785042 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5bkt\" (UniqueName: \"kubernetes.io/projected/3b77f113-f8c0-47b8-ad79-d1be38bf6e09-kube-api-access-x5bkt\") pod \"3b77f113-f8c0-47b8-ad79-d1be38bf6e09\" (UID: \"3b77f113-f8c0-47b8-ad79-d1be38bf6e09\") "
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.790842 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b77f113-f8c0-47b8-ad79-d1be38bf6e09-kube-api-access-x5bkt" (OuterVolumeSpecName: "kube-api-access-x5bkt") pod "3b77f113-f8c0-47b8-ad79-d1be38bf6e09" (UID: "3b77f113-f8c0-47b8-ad79-d1be38bf6e09"). InnerVolumeSpecName "kube-api-access-x5bkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.886889 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5bkt\" (UniqueName: \"kubernetes.io/projected/3b77f113-f8c0-47b8-ad79-d1be38bf6e09-kube-api-access-x5bkt\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:35 crc kubenswrapper[4751]: I0131 15:06:35.984536 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.088768 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzzlf\" (UniqueName: \"kubernetes.io/projected/44c515c1-f30f-44da-8959-cfd2530b46b7-kube-api-access-rzzlf\") pod \"44c515c1-f30f-44da-8959-cfd2530b46b7\" (UID: \"44c515c1-f30f-44da-8959-cfd2530b46b7\") "
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.091693 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c515c1-f30f-44da-8959-cfd2530b46b7-kube-api-access-rzzlf" (OuterVolumeSpecName: "kube-api-access-rzzlf") pod "44c515c1-f30f-44da-8959-cfd2530b46b7" (UID: "44c515c1-f30f-44da-8959-cfd2530b46b7"). InnerVolumeSpecName "kube-api-access-rzzlf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.189918 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzzlf\" (UniqueName: \"kubernetes.io/projected/44c515c1-f30f-44da-8959-cfd2530b46b7-kube-api-access-rzzlf\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.332932 4751 generic.go:334] "Generic (PLEG): container finished" podID="44c515c1-f30f-44da-8959-cfd2530b46b7" containerID="584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15" exitCode=0
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.332960 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.333025 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" event={"ID":"44c515c1-f30f-44da-8959-cfd2530b46b7","Type":"ContainerDied","Data":"584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15"}
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.333064 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-2wvgm" event={"ID":"44c515c1-f30f-44da-8959-cfd2530b46b7","Type":"ContainerDied","Data":"b07bdb6979a897c42db641896c014136c4b8817fab040635c752ccba6b137d19"}
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.333111 4751 scope.go:117] "RemoveContainer" containerID="584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.334569 4751 generic.go:334] "Generic (PLEG): container finished" podID="3b77f113-f8c0-47b8-ad79-d1be38bf6e09" containerID="669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f" exitCode=0
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.334590 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" event={"ID":"3b77f113-f8c0-47b8-ad79-d1be38bf6e09","Type":"ContainerDied","Data":"669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f"}
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.334605 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg" event={"ID":"3b77f113-f8c0-47b8-ad79-d1be38bf6e09","Type":"ContainerDied","Data":"9417f04e17815ef9de6ec5d2357c85d9f600b65c7a818fc63c494820d893f560"}
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.334631 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.367684 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-2wvgm"]
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.377101 4751 scope.go:117] "RemoveContainer" containerID="584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15"
Jan 31 15:06:36 crc kubenswrapper[4751]: E0131 15:06:36.377828 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15\": container with ID starting with 584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15 not found: ID does not exist" containerID="584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.377874 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15"} err="failed to get container status \"584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15\": rpc error: code = NotFound desc = could not find container \"584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15\": container with ID starting with 584fb5ac2ba8eb206c7f2718045838c65ff9fa87a8c959a34945da15defa3f15 not found: ID does not exist"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.377894 4751 scope.go:117] "RemoveContainer" containerID="669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.383059 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-2wvgm"]
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.392946 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"]
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.398501 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-fnbvg"]
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.401748 4751 scope.go:117] "RemoveContainer" containerID="669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f"
Jan 31 15:06:36 crc kubenswrapper[4751]: E0131 15:06:36.402251 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f\": container with ID starting with 669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f not found: ID does not exist" containerID="669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.402276 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f"} err="failed to get container status \"669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f\": rpc error: code = NotFound desc = could not find container \"669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f\": container with ID starting with 669ca17a7f872d19f207a1db8b3eeef3d4392ee464f55983153e6452773c111f not found: ID does not exist"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.417706 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b77f113-f8c0-47b8-ad79-d1be38bf6e09" path="/var/lib/kubelet/pods/3b77f113-f8c0-47b8-ad79-d1be38bf6e09/volumes"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.424372 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c515c1-f30f-44da-8959-cfd2530b46b7" path="/var/lib/kubelet/pods/44c515c1-f30f-44da-8959-cfd2530b46b7/volumes"
Jan 31 15:06:36 crc kubenswrapper[4751]: I0131 15:06:36.425718 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a525382f-29ee-4393-9e5b-1b3e989a1bc3" path="/var/lib/kubelet/pods/a525382f-29ee-4393-9e5b-1b3e989a1bc3/volumes"
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.505158 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"]
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.505708 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" podUID="6578d137-d120-43b2-99e3-71d4f6525d6c" containerName="manager" containerID="cri-o://5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8" gracePeriod=10
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.760503 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-5tz82"]
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.760729 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-5tz82" podUID="15539f33-874c-45ae-8ee2-7f821c54b267" containerName="registry-server" containerID="cri-o://2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a" gracePeriod=30
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.792494 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6"]
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.796898 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576w2vw6"]
Jan 31 15:06:37 crc kubenswrapper[4751]: I0131 15:06:37.974360 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.115417 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-apiservice-cert\") pod \"6578d137-d120-43b2-99e3-71d4f6525d6c\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") "
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.115490 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-webhook-cert\") pod \"6578d137-d120-43b2-99e3-71d4f6525d6c\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") "
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.115575 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hq5r\" (UniqueName: \"kubernetes.io/projected/6578d137-d120-43b2-99e3-71d4f6525d6c-kube-api-access-7hq5r\") pod \"6578d137-d120-43b2-99e3-71d4f6525d6c\" (UID: \"6578d137-d120-43b2-99e3-71d4f6525d6c\") "
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.121315 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "6578d137-d120-43b2-99e3-71d4f6525d6c" (UID: "6578d137-d120-43b2-99e3-71d4f6525d6c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.122177 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "6578d137-d120-43b2-99e3-71d4f6525d6c" (UID: "6578d137-d120-43b2-99e3-71d4f6525d6c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.128862 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6578d137-d120-43b2-99e3-71d4f6525d6c-kube-api-access-7hq5r" (OuterVolumeSpecName: "kube-api-access-7hq5r") pod "6578d137-d120-43b2-99e3-71d4f6525d6c" (UID: "6578d137-d120-43b2-99e3-71d4f6525d6c"). InnerVolumeSpecName "kube-api-access-7hq5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.159826 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5tz82"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.216311 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmd6m\" (UniqueName: \"kubernetes.io/projected/15539f33-874c-45ae-8ee2-7f821c54b267-kube-api-access-rmd6m\") pod \"15539f33-874c-45ae-8ee2-7f821c54b267\" (UID: \"15539f33-874c-45ae-8ee2-7f821c54b267\") "
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.216555 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.216575 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6578d137-d120-43b2-99e3-71d4f6525d6c-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.216583 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hq5r\" (UniqueName: \"kubernetes.io/projected/6578d137-d120-43b2-99e3-71d4f6525d6c-kube-api-access-7hq5r\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.220488 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15539f33-874c-45ae-8ee2-7f821c54b267-kube-api-access-rmd6m" (OuterVolumeSpecName: "kube-api-access-rmd6m") pod "15539f33-874c-45ae-8ee2-7f821c54b267" (UID: "15539f33-874c-45ae-8ee2-7f821c54b267"). InnerVolumeSpecName "kube-api-access-rmd6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.318145 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmd6m\" (UniqueName: \"kubernetes.io/projected/15539f33-874c-45ae-8ee2-7f821c54b267-kube-api-access-rmd6m\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.354112 4751 generic.go:334] "Generic (PLEG): container finished" podID="6578d137-d120-43b2-99e3-71d4f6525d6c" containerID="5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8" exitCode=0
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.354134 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.354215 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" event={"ID":"6578d137-d120-43b2-99e3-71d4f6525d6c","Type":"ContainerDied","Data":"5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8"}
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.354267 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl" event={"ID":"6578d137-d120-43b2-99e3-71d4f6525d6c","Type":"ContainerDied","Data":"fd210a97bb4f47dccbcdbfba3a6c2101ade7c45f9468d34991d8307e718c3b16"}
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.354288 4751 scope.go:117] "RemoveContainer" containerID="5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.356708 4751 generic.go:334] "Generic (PLEG): container finished" podID="15539f33-874c-45ae-8ee2-7f821c54b267" containerID="2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a" exitCode=0
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.356754 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-5tz82"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.356792 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5tz82" event={"ID":"15539f33-874c-45ae-8ee2-7f821c54b267","Type":"ContainerDied","Data":"2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a"}
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.357151 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-5tz82" event={"ID":"15539f33-874c-45ae-8ee2-7f821c54b267","Type":"ContainerDied","Data":"a32321b4d51d551ed7ea834004f3d66d0ba16e8c6d1b16cfe9fefade795fabc7"}
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.382312 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"]
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.387598 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57f67fdff5-45pkl"]
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.399366 4751 scope.go:117] "RemoveContainer" containerID="5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8"
Jan 31 15:06:38 crc kubenswrapper[4751]: E0131 15:06:38.400236 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8\": container with ID starting with 5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8 not found: ID does not exist" containerID="5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.400281 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8"} err="failed to get container status \"5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8\": rpc error: code = NotFound desc = could not find container \"5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8\": container with ID starting with 5dbebc52897c07a2e7dfa38ad3cf5873d3f7ef9969f655b93dd162236a7cbaa8 not found: ID does not exist"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.400309 4751 scope.go:117] "RemoveContainer" containerID="2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.403336 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-5tz82"]
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.412278 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a3b16f-f39d-413a-b623-3ac15aba50cf" path="/var/lib/kubelet/pods/29a3b16f-f39d-413a-b623-3ac15aba50cf/volumes"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.412864 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6578d137-d120-43b2-99e3-71d4f6525d6c" path="/var/lib/kubelet/pods/6578d137-d120-43b2-99e3-71d4f6525d6c/volumes"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.413275 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-5tz82"]
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.421803 4751 scope.go:117] "RemoveContainer" containerID="2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a"
Jan 31 15:06:38 crc kubenswrapper[4751]: E0131 15:06:38.422311 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a\": container with ID starting with 2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a not found: ID does not exist" containerID="2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a"
Jan 31 15:06:38 crc kubenswrapper[4751]: I0131 15:06:38.422351 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a"} err="failed to get container status \"2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a\": rpc error: code = NotFound desc = could not find container \"2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a\": container with ID starting with 2880fdfb3f748e15e74bf089fd52f14043b007613c5d6a9fb47038e2c465f42a not found: ID does not exist"
Jan 31 15:06:39 crc kubenswrapper[4751]: I0131 15:06:39.679669 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"]
Jan 31 15:06:39 crc kubenswrapper[4751]: I0131 15:06:39.680261 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" podUID="14df28b7-d7cb-466e-aa07-69e320d71620" containerName="manager" containerID="cri-o://335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41" gracePeriod=10
Jan 31 15:06:39 crc kubenswrapper[4751]: I0131 15:06:39.974581 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-lpshr"]
Jan 31 15:06:39 crc kubenswrapper[4751]: I0131 15:06:39.974788 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-lpshr" podUID="11fab5ff-3041-45d3-8aab-29e25ed8c6ae" containerName="registry-server" containerID="cri-o://ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63" gracePeriod=30
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.003204 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q"]
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.007738 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40b4j2q"]
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.111887 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.241343 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-webhook-cert\") pod \"14df28b7-d7cb-466e-aa07-69e320d71620\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") "
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.241457 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zcjm\" (UniqueName: \"kubernetes.io/projected/14df28b7-d7cb-466e-aa07-69e320d71620-kube-api-access-5zcjm\") pod \"14df28b7-d7cb-466e-aa07-69e320d71620\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") "
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.241503 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-apiservice-cert\") pod \"14df28b7-d7cb-466e-aa07-69e320d71620\" (UID: \"14df28b7-d7cb-466e-aa07-69e320d71620\") "
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.246776 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14df28b7-d7cb-466e-aa07-69e320d71620-kube-api-access-5zcjm" (OuterVolumeSpecName: "kube-api-access-5zcjm") pod "14df28b7-d7cb-466e-aa07-69e320d71620" (UID: "14df28b7-d7cb-466e-aa07-69e320d71620"). InnerVolumeSpecName "kube-api-access-5zcjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.256200 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "14df28b7-d7cb-466e-aa07-69e320d71620" (UID: "14df28b7-d7cb-466e-aa07-69e320d71620"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.268189 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "14df28b7-d7cb-466e-aa07-69e320d71620" (UID: "14df28b7-d7cb-466e-aa07-69e320d71620"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.342773 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zcjm\" (UniqueName: \"kubernetes.io/projected/14df28b7-d7cb-466e-aa07-69e320d71620-kube-api-access-5zcjm\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.342812 4751 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.342822 4751 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14df28b7-d7cb-466e-aa07-69e320d71620-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.355182 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-lpshr"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.386771 4751 generic.go:334] "Generic (PLEG): container finished" podID="11fab5ff-3041-45d3-8aab-29e25ed8c6ae" containerID="ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63" exitCode=0
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.386840 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-lpshr"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.386863 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lpshr" event={"ID":"11fab5ff-3041-45d3-8aab-29e25ed8c6ae","Type":"ContainerDied","Data":"ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63"}
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.386898 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lpshr" event={"ID":"11fab5ff-3041-45d3-8aab-29e25ed8c6ae","Type":"ContainerDied","Data":"713674d4d326d8545cf66e50aee47bded08afa8e12a04a763f8540e3552c31dd"}
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.386922 4751 scope.go:117] "RemoveContainer" containerID="ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.390286 4751 generic.go:334] "Generic (PLEG): container finished" podID="14df28b7-d7cb-466e-aa07-69e320d71620" containerID="335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41" exitCode=0
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.390330 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" event={"ID":"14df28b7-d7cb-466e-aa07-69e320d71620","Type":"ContainerDied","Data":"335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41"}
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.390356 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn" event={"ID":"14df28b7-d7cb-466e-aa07-69e320d71620","Type":"ContainerDied","Data":"8d7ddc4e6b1f882339c27c9bee06d6abc3c29498935b356f92bf581f66149e68"}
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.390416 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.409634 4751 scope.go:117] "RemoveContainer" containerID="ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63"
Jan 31 15:06:40 crc kubenswrapper[4751]: E0131 15:06:40.410626 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63\": container with ID starting with ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63 not found: ID does not exist" containerID="ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.410687 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63"} err="failed to get container status \"ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63\": rpc error: code = NotFound desc = could not find container \"ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63\": container with ID starting with ee8f26396384b77a167d32d6b25c5bda6d0d4b830d367daeecc1c7c591a2ef63 not found: ID does not exist"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.410722 4751 scope.go:117] "RemoveContainer" containerID="335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.415399 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15539f33-874c-45ae-8ee2-7f821c54b267" path="/var/lib/kubelet/pods/15539f33-874c-45ae-8ee2-7f821c54b267/volumes"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.416253 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667a6cec-bf73-4340-9be6-f4bc10182004" path="/var/lib/kubelet/pods/667a6cec-bf73-4340-9be6-f4bc10182004/volumes"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.430588 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"]
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.433653 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-65848b4486-qb6hn"]
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.443433 4751 scope.go:117] "RemoveContainer" containerID="335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.443729 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hltq9\" (UniqueName: \"kubernetes.io/projected/11fab5ff-3041-45d3-8aab-29e25ed8c6ae-kube-api-access-hltq9\") pod \"11fab5ff-3041-45d3-8aab-29e25ed8c6ae\" (UID: \"11fab5ff-3041-45d3-8aab-29e25ed8c6ae\") "
Jan 31 15:06:40 crc kubenswrapper[4751]: E0131 15:06:40.446219 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41\": container with ID starting with 335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41 not found: ID does not exist" containerID="335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.446252 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41"} err="failed to get container status \"335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41\": rpc error: code = NotFound desc = could not find container \"335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41\": container with ID starting with 335b59bfcc7957562303125aed6f69b84b76a35a78ef760e49817204c373ec41 not found: ID does not exist"
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.447249 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fab5ff-3041-45d3-8aab-29e25ed8c6ae-kube-api-access-hltq9" (OuterVolumeSpecName: "kube-api-access-hltq9") pod "11fab5ff-3041-45d3-8aab-29e25ed8c6ae" (UID: "11fab5ff-3041-45d3-8aab-29e25ed8c6ae"). InnerVolumeSpecName "kube-api-access-hltq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.544847 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hltq9\" (UniqueName: \"kubernetes.io/projected/11fab5ff-3041-45d3-8aab-29e25ed8c6ae-kube-api-access-hltq9\") on node \"crc\" DevicePath \"\""
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.716120 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-lpshr"]
Jan 31 15:06:40 crc kubenswrapper[4751]: I0131 15:06:40.723276 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-lpshr"]
Jan 31 15:06:42 crc kubenswrapper[4751]: I0131 15:06:42.412354 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fab5ff-3041-45d3-8aab-29e25ed8c6ae" path="/var/lib/kubelet/pods/11fab5ff-3041-45d3-8aab-29e25ed8c6ae/volumes"
Jan 31 15:06:42 crc kubenswrapper[4751]: I0131 15:06:42.413125 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="14df28b7-d7cb-466e-aa07-69e320d71620" path="/var/lib/kubelet/pods/14df28b7-d7cb-466e-aa07-69e320d71620/volumes" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.178522 4751 scope.go:117] "RemoveContainer" containerID="ed9ea3bb8f54f1c0a1685efd692fcb4334fbd2ea55432c305b974a3bf1ca584b" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.204249 4751 scope.go:117] "RemoveContainer" containerID="eec75dcec16927bdd78c685c8995e59bfecf459a9739faabb410481b5046b1fb" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.230120 4751 scope.go:117] "RemoveContainer" containerID="ecd0273950524364ff0a405d7ba30af3f5ab2065b0d4986c88176cf55c6327d6" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.252523 4751 scope.go:117] "RemoveContainer" containerID="914fa7bc157f85f90159778e4a352984883804f817b8f2353eb69568b5c31c21" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.271541 4751 scope.go:117] "RemoveContainer" containerID="5d457b880e70ab7d7bdcd88eb562c916f03e6b62d577ebf9192cc4974cd177f7" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.311218 4751 scope.go:117] "RemoveContainer" containerID="8f2f8355ecce67c5c0aa186fe2a2c3a5d75143a19a9cc7d982cad7e44dc2d94f" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.331853 4751 scope.go:117] "RemoveContainer" containerID="8784247046f02ab2d8c0a52ce8233e64d23a7cd286c98e45a4c36115e6daf6d3" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.346019 4751 scope.go:117] "RemoveContainer" containerID="3bbca91afaf0c02d15eadfd14c9b7b21724ed7ad9f88766a7c7a0c41fcf118a3" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.359308 4751 scope.go:117] "RemoveContainer" containerID="1985ee06fa1b0e5b47503229ec369a787fff12bff875d4cad0ea6a84e35d2169" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.373612 4751 scope.go:117] "RemoveContainer" containerID="522600cf4dfb7197c49e6a2fb7abef1d560bd673fb2da9388c38a54462595db0" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.414010 4751 scope.go:117] 
"RemoveContainer" containerID="d7982a0dd9c095e8b3eb11a8ff02587d379ddd34791962ab93f48b60a33bec98" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.432111 4751 scope.go:117] "RemoveContainer" containerID="be0ffdbf0de55d407a928a375e5355c5f5a9cda93c0fc7ee45e3254cddeefdc8" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.481330 4751 scope.go:117] "RemoveContainer" containerID="f84e5f08594d3f72dc6ce544065026534e30bfc6f05c4074d6d95900baad7f74" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.497841 4751 scope.go:117] "RemoveContainer" containerID="48b5fe15e6b5d08f52dd98326462e391e5fafbd3bd396d34d3c7b444efffa146" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.521733 4751 scope.go:117] "RemoveContainer" containerID="f8a8825d481236aeb9aa96c02aca48495f3689b5e59d7cbdc781d2a43a293e1d" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.538574 4751 scope.go:117] "RemoveContainer" containerID="8c99859db003b8960447da601e95711f7b0d1554d7ee22f9d6cb9490f3263093" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.562322 4751 scope.go:117] "RemoveContainer" containerID="24848de7678f7cd58f76b4f47400dce420906e54dfe8d1ef4c220211c4bbb57e" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.580672 4751 scope.go:117] "RemoveContainer" containerID="30a855aaf2538d16f15d520cfdce2fe3cf7008190e9478d912986cc8f0f389d2" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.595975 4751 scope.go:117] "RemoveContainer" containerID="fdff4dbce192cc3ad36befaed2781dd7252206ed773249a695ca5b7f5682312b" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.658092 4751 scope.go:117] "RemoveContainer" containerID="ac644719d568c7b156ce9cbb766a2f8c70e69f2f94ca1bad0488a7736c5cd6c9" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.672881 4751 scope.go:117] "RemoveContainer" containerID="acab140e6ca6aa95c6844fc3952eecbc060037dc14e3f1b6a536e962fd34fb0c" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.688717 4751 scope.go:117] "RemoveContainer" 
containerID="b26b741fdf290763b6328eb1a8c5b1a7f048f2aecba802a031d85386bf813c0e" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.704791 4751 scope.go:117] "RemoveContainer" containerID="48586bec329cecb88f31df9f626d414b524092e8f0898f91d2fb0a6740d113ca" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.728359 4751 scope.go:117] "RemoveContainer" containerID="457ea80a5f749ea606e6892b07ad8e22c7b832800f0f223bc54849035a17270d" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.747487 4751 scope.go:117] "RemoveContainer" containerID="c816a8193fdacfa313315863400dd00d03b42ba5e5ce2524c35985ffd3fa845b" Jan 31 15:06:46 crc kubenswrapper[4751]: I0131 15:06:46.766480 4751 scope.go:117] "RemoveContainer" containerID="1220529350d17a7bb750446818ec08ebb9bc079afd6ac80866f7fe1abd4f1db3" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.505043 4751 generic.go:334] "Generic (PLEG): container finished" podID="440e5809-7b49-4b21-99dd-668468c84017" containerID="ae11b6c0a7f7893c0ba728593c9e1b6db0bc399ae9c55df1f1023d422fc9333c" exitCode=137 Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.505573 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"ae11b6c0a7f7893c0ba728593c9e1b6db0bc399ae9c55df1f1023d422fc9333c"} Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.723229 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.783654 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"440e5809-7b49-4b21-99dd-668468c84017\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.783751 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-lock\") pod \"440e5809-7b49-4b21-99dd-668468c84017\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.783928 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvvn\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-kube-api-access-kqvvn\") pod \"440e5809-7b49-4b21-99dd-668468c84017\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.783965 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-cache\") pod \"440e5809-7b49-4b21-99dd-668468c84017\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.784050 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") pod \"440e5809-7b49-4b21-99dd-668468c84017\" (UID: \"440e5809-7b49-4b21-99dd-668468c84017\") " Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.784265 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-lock" (OuterVolumeSpecName: 
"lock") pod "440e5809-7b49-4b21-99dd-668468c84017" (UID: "440e5809-7b49-4b21-99dd-668468c84017"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.784607 4751 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-lock\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.786800 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-cache" (OuterVolumeSpecName: "cache") pod "440e5809-7b49-4b21-99dd-668468c84017" (UID: "440e5809-7b49-4b21-99dd-668468c84017"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.789249 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "swift") pod "440e5809-7b49-4b21-99dd-668468c84017" (UID: "440e5809-7b49-4b21-99dd-668468c84017"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.789257 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-kube-api-access-kqvvn" (OuterVolumeSpecName: "kube-api-access-kqvvn") pod "440e5809-7b49-4b21-99dd-668468c84017" (UID: "440e5809-7b49-4b21-99dd-668468c84017"). InnerVolumeSpecName "kube-api-access-kqvvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.789276 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "440e5809-7b49-4b21-99dd-668468c84017" (UID: "440e5809-7b49-4b21-99dd-668468c84017"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.885535 4751 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.885592 4751 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.885606 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqvvn\" (UniqueName: \"kubernetes.io/projected/440e5809-7b49-4b21-99dd-668468c84017-kube-api-access-kqvvn\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.885619 4751 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/440e5809-7b49-4b21-99dd-668468c84017-cache\") on node \"crc\" DevicePath \"\"" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.895179 4751 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 31 15:06:50 crc kubenswrapper[4751]: I0131 15:06:50.986787 4751 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" 
Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.527480 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"440e5809-7b49-4b21-99dd-668468c84017","Type":"ContainerDied","Data":"7955d37d9d1be24fa8d9a015aa2ea953036cee2a0334d1dbf39fdbe1dcef40e5"} Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.527584 4751 scope.go:117] "RemoveContainer" containerID="ae11b6c0a7f7893c0ba728593c9e1b6db0bc399ae9c55df1f1023d422fc9333c" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.527614 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.558042 4751 scope.go:117] "RemoveContainer" containerID="519bd8155f30918b172e24832e84310378bd7ea10e796377a992dd3fe9e7276d" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.578784 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.585661 4751 scope.go:117] "RemoveContainer" containerID="950232b5b660c70b9100e81003ff993443f745f40d7da6ba8dc037822059cb8e" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.590312 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.604574 4751 scope.go:117] "RemoveContainer" containerID="71ca1416bdc095b268ec385a4ebcd269b729c80c3aee7f832db2892f4fe6e78a" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.628312 4751 scope.go:117] "RemoveContainer" containerID="1e2003fe4d2366b583ebedf393e2492c910be0ebf3f2652f5a15b1e8c78961df" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.649961 4751 scope.go:117] "RemoveContainer" containerID="03b25054db738f38056ec8af2822c9203e252f1a4f95be8c4ab8c1c34de3455c" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.671370 4751 scope.go:117] "RemoveContainer" 
containerID="1f74cf8c2ce97cd17f509447e4c986197d8af0e8b2f40e7c6a07653c81e66d3b" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.700931 4751 scope.go:117] "RemoveContainer" containerID="400722d3dac6cd5b0b727b3e599b127bb527981160049f2561a32e7ada14affd" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.726116 4751 scope.go:117] "RemoveContainer" containerID="34a87b0cfca857f6a2c07d4713531103b7df75f0fdc3e2be299ecaf554d5d9db" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.752365 4751 scope.go:117] "RemoveContainer" containerID="03c86cbbc819872662746f2a8384c7c50f07b481c42b5f3d39e0b1e87c7b0557" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.782755 4751 scope.go:117] "RemoveContainer" containerID="3b4375e902d16ea731761694aa85354dcfcda568f68f1d4210b06b07c701f380" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.803922 4751 scope.go:117] "RemoveContainer" containerID="a4e14596c5c3a7af2ea9e82736c916fc73b8fcbf27a523b8fe47f9a8e69b1bc2" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.823358 4751 scope.go:117] "RemoveContainer" containerID="461a1aaa8bc72705195647c97b28e111484e900c69e9a4da07e510a6c451ed4c" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.842945 4751 scope.go:117] "RemoveContainer" containerID="d0ab6cd06ea2abbd171a5345dc579495df175d9d8a52b30a0139e24e65e43616" Jan 31 15:06:51 crc kubenswrapper[4751]: I0131 15:06:51.857646 4751 scope.go:117] "RemoveContainer" containerID="d93f0c8cc4f4e310c9d207351f924f281c14e44b511b3d4a8f51fed27dbeed8f" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.419187 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440e5809-7b49-4b21-99dd-668468c84017" path="/var/lib/kubelet/pods/440e5809-7b49-4b21-99dd-668468c84017/volumes" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.560610 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rzlpf/must-gather-k47wq"] Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.560969 4751 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6578d137-d120-43b2-99e3-71d4f6525d6c" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.560985 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="6578d137-d120-43b2-99e3-71d4f6525d6c" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.560998 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08530f42-16c5-4253-a623-2a032aeb95a7" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561008 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="08530f42-16c5-4253-a623-2a032aeb95a7" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561022 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-replicator" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561047 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-replicator" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561059 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949efaf4-a5db-405d-9d40-c44d525c603c" containerName="mariadb-account-delete" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561092 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="949efaf4-a5db-405d-9d40-c44d525c603c" containerName="mariadb-account-delete" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561109 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14df28b7-d7cb-466e-aa07-69e320d71620" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561118 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="14df28b7-d7cb-466e-aa07-69e320d71620" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561132 4751 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" containerName="memcached" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561141 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" containerName="memcached" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561172 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerName="rabbitmq" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561182 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerName="rabbitmq" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561196 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerName="mysql-bootstrap" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561203 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerName="mysql-bootstrap" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561214 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="extract-content" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561222 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="extract-content" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561253 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-updater" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561262 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-updater" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561270 4751 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="galera" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561280 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="galera" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561295 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70443db-a342-4f5d-81b2-39c01f494cf8" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561302 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70443db-a342-4f5d-81b2-39c01f494cf8" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561332 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-replicator" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561341 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-replicator" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561352 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="mysql-bootstrap" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561360 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="mysql-bootstrap" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561372 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ea8aae-ad89-4383-8f2f-ba35872fd605" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561380 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ea8aae-ad89-4383-8f2f-ba35872fd605" containerName="manager" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561408 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" 
containerName="object-expirer" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561417 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-expirer" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561427 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerName="galera" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561436 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerName="galera" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561446 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="extract-utilities" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561455 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="extract-utilities" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561487 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c515c1-f30f-44da-8959-cfd2530b46b7" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561497 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c515c1-f30f-44da-8959-cfd2530b46b7" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561510 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-updater" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561519 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-updater" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561533 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="rsync" Jan 31 
15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561540 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="rsync" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561569 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-auditor" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561579 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-auditor" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561592 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fab5ff-3041-45d3-8aab-29e25ed8c6ae" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561601 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fab5ff-3041-45d3-8aab-29e25ed8c6ae" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561612 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eacc0c6c-95c4-487f-945e-4a1e3e17c508" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561619 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="eacc0c6c-95c4-487f-945e-4a1e3e17c508" containerName="registry-server" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561660 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-replicator" Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561669 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-replicator" Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561680 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065b8624-7cdb-463c-9636-d3e980119eb7" containerName="registry-server" Jan 31 15:06:52 crc 
kubenswrapper[4751]: I0131 15:06:52.561688 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="065b8624-7cdb-463c-9636-d3e980119eb7" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561700 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15539f33-874c-45ae-8ee2-7f821c54b267" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561709 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="15539f33-874c-45ae-8ee2-7f821c54b267" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561742 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerName="galera"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561750 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerName="galera"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561762 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3dfaad-d451-448b-a447-47fc7bbff0e5" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561770 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3dfaad-d451-448b-a447-47fc7bbff0e5" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561783 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="swift-recon-cron"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561791 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="swift-recon-cron"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561819 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561827 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561840 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561848 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561860 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561868 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561897 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerName="setup-container"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561905 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerName="setup-container"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561918 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b77f113-f8c0-47b8-ad79-d1be38bf6e09" containerName="operator"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561926 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b77f113-f8c0-47b8-ad79-d1be38bf6e09" containerName="operator"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561939 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561948 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561976 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-reaper"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.561984 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-reaper"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.561997 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabb55da-08db-4d2a-8b2d-ac7b2b657053" containerName="keystone-api"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562005 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabb55da-08db-4d2a-8b2d-ac7b2b657053" containerName="keystone-api"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.562015 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562022 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.562034 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562043 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: E0131 15:06:52.562054 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerName="mysql-bootstrap"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562092 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerName="mysql-bootstrap"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562261 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562275 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-updater"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562289 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="6578d137-d120-43b2-99e3-71d4f6525d6c" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562299 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="19317a08-b18b-42c9-bdc9-394e1e06257d" containerName="rabbitmq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562307 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562316 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562327 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="949efaf4-a5db-405d-9d40-c44d525c603c" containerName="mariadb-account-delete"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562336 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c515c1-f30f-44da-8959-cfd2530b46b7" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562348 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="d783dd01-73a7-4362-888a-ab84bc8739df" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562360 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="065b8624-7cdb-463c-9636-d3e980119eb7" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562371 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="14df28b7-d7cb-466e-aa07-69e320d71620" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562382 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-reaper"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562394 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b77f113-f8c0-47b8-ad79-d1be38bf6e09" containerName="operator"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562405 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562415 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="15539f33-874c-45ae-8ee2-7f821c54b267" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562424 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabb55da-08db-4d2a-8b2d-ac7b2b657053" containerName="keystone-api"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562453 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70443db-a342-4f5d-81b2-39c01f494cf8" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562464 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fab5ff-3041-45d3-8aab-29e25ed8c6ae" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562472 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfaa3fc-8bf7-420f-8581-4e917bf3f41c" containerName="memcached"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562481 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="swift-recon-cron"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562492 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ea8aae-ad89-4383-8f2f-ba35872fd605" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562501 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="account-replicator"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562510 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-replicator"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562521 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-auditor"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562533 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="rsync"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562543 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcd9bac-c0cb-4de4-b630-0db07f110da7" containerName="galera"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562554 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="22459bcc-672e-4390-89ae-2b5fa48ded71" containerName="galera"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562563 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-expirer"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562574 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="08530f42-16c5-4253-a623-2a032aeb95a7" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562583 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3dfaad-d451-448b-a447-47fc7bbff0e5" containerName="manager"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562592 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="eacc0c6c-95c4-487f-945e-4a1e3e17c508" containerName="registry-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562603 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="container-replicator"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562614 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-server"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562626 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a2906d-db30-4578-8b1e-088ca2f20ced" containerName="galera"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.562636 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e5809-7b49-4b21-99dd-668468c84017" containerName="object-updater"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.563349 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.569595 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rzlpf"/"openshift-service-ca.crt"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.571998 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rzlpf"/"kube-root-ca.crt"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.584912 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rzlpf/must-gather-k47wq"]
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.613749 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4br7s\" (UniqueName: \"kubernetes.io/projected/e85b3ee2-7979-400f-a052-d00fe6e44fd8-kube-api-access-4br7s\") pod \"must-gather-k47wq\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.613884 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85b3ee2-7979-400f-a052-d00fe6e44fd8-must-gather-output\") pod \"must-gather-k47wq\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.715132 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85b3ee2-7979-400f-a052-d00fe6e44fd8-must-gather-output\") pod \"must-gather-k47wq\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.715185 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4br7s\" (UniqueName: \"kubernetes.io/projected/e85b3ee2-7979-400f-a052-d00fe6e44fd8-kube-api-access-4br7s\") pod \"must-gather-k47wq\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.715527 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85b3ee2-7979-400f-a052-d00fe6e44fd8-must-gather-output\") pod \"must-gather-k47wq\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.731864 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4br7s\" (UniqueName: \"kubernetes.io/projected/e85b3ee2-7979-400f-a052-d00fe6e44fd8-kube-api-access-4br7s\") pod \"must-gather-k47wq\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:52 crc kubenswrapper[4751]: I0131 15:06:52.877768 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rzlpf/must-gather-k47wq"
Jan 31 15:06:53 crc kubenswrapper[4751]: I0131 15:06:53.295024 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rzlpf/must-gather-k47wq"]
Jan 31 15:06:53 crc kubenswrapper[4751]: I0131 15:06:53.557090 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rzlpf/must-gather-k47wq" event={"ID":"e85b3ee2-7979-400f-a052-d00fe6e44fd8","Type":"ContainerStarted","Data":"596e9d64dc5ad58dd65790abe5253956e0af8e7593d0653f2fff18bf0269a2e6"}
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.474041 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.474743 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:06:56.974716218 +0000 UTC m=+1521.349429123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.474075 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.475278 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:06:56.975258642 +0000 UTC m=+1521.349971547 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.980885 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.980915 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.980981 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:06:57.980957603 +0000 UTC m=+1522.355670488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found
Jan 31 15:06:56 crc kubenswrapper[4751]: E0131 15:06:56.981003 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:06:57.980994724 +0000 UTC m=+1522.355707729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found
Jan 31 15:06:57 crc kubenswrapper[4751]: I0131 15:06:57.583784 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rzlpf/must-gather-k47wq" event={"ID":"e85b3ee2-7979-400f-a052-d00fe6e44fd8","Type":"ContainerStarted","Data":"73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611"}
Jan 31 15:06:57 crc kubenswrapper[4751]: I0131 15:06:57.584139 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rzlpf/must-gather-k47wq" event={"ID":"e85b3ee2-7979-400f-a052-d00fe6e44fd8","Type":"ContainerStarted","Data":"6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a"}
Jan 31 15:06:57 crc kubenswrapper[4751]: I0131 15:06:57.598328 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rzlpf/must-gather-k47wq" podStartSLOduration=1.853880569 podStartE2EDuration="5.598303642s" podCreationTimestamp="2026-01-31 15:06:52 +0000 UTC" firstStartedPulling="2026-01-31 15:06:53.315988132 +0000 UTC m=+1517.690701017" lastFinishedPulling="2026-01-31 15:06:57.060411205 +0000 UTC m=+1521.435124090" observedRunningTime="2026-01-31 15:06:57.596886874 +0000 UTC m=+1521.971599759" watchObservedRunningTime="2026-01-31 15:06:57.598303642 +0000 UTC m=+1521.973016547"
Jan 31 15:06:57 crc kubenswrapper[4751]: E0131 15:06:57.994558 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 15:06:57 crc kubenswrapper[4751]: E0131 15:06:57.994639 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:06:59.994620816 +0000 UTC m=+1524.369333701 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found
Jan 31 15:06:57 crc kubenswrapper[4751]: E0131 15:06:57.994638 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 15:06:57 crc kubenswrapper[4751]: E0131 15:06:57.994712 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:06:59.994694448 +0000 UTC m=+1524.369407343 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found
Jan 31 15:07:00 crc kubenswrapper[4751]: E0131 15:07:00.028168 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 15:07:00 crc kubenswrapper[4751]: E0131 15:07:00.028322 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 15:07:00 crc kubenswrapper[4751]: E0131 15:07:00.028473 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:07:04.028457693 +0000 UTC m=+1528.403170578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found
Jan 31 15:07:00 crc kubenswrapper[4751]: E0131 15:07:00.028595 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:07:04.028568806 +0000 UTC m=+1528.403281751 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found
Jan 31 15:07:04 crc kubenswrapper[4751]: E0131 15:07:04.079397 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 15:07:04 crc kubenswrapper[4751]: E0131 15:07:04.079748 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:07:12.079730762 +0000 UTC m=+1536.454443647 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found
Jan 31 15:07:04 crc kubenswrapper[4751]: E0131 15:07:04.079405 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 15:07:04 crc kubenswrapper[4751]: E0131 15:07:04.079850 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:07:12.079833275 +0000 UTC m=+1536.454546160 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found
Jan 31 15:07:10 crc kubenswrapper[4751]: I0131 15:07:10.965972 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4c48n"]
Jan 31 15:07:10 crc kubenswrapper[4751]: I0131 15:07:10.967893 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:10 crc kubenswrapper[4751]: I0131 15:07:10.983769 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4c48n"]
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.076683 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-utilities\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.076737 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-catalog-content\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.076778 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/50a49e54-556d-487d-8cdf-3fd3dc9442a5-kube-api-access-kgxk8\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.178658 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-utilities\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.178713 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-catalog-content\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.178752 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/50a49e54-556d-487d-8cdf-3fd3dc9442a5-kube-api-access-kgxk8\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.179222 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-utilities\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.179491 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-catalog-content\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.220330 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/50a49e54-556d-487d-8cdf-3fd3dc9442a5-kube-api-access-kgxk8\") pod \"certified-operators-4c48n\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.288043 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4c48n"
Jan 31 15:07:11 crc kubenswrapper[4751]: I0131 15:07:11.744814 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4c48n"]
Jan 31 15:07:11 crc kubenswrapper[4751]: W0131 15:07:11.749326 4751 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a49e54_556d_487d_8cdf_3fd3dc9442a5.slice/crio-257e8fb19ff2e9763f697cfaf1e9176eb8fd5fdd2c4c4d5de33e938223dd6f02 WatchSource:0}: Error finding container 257e8fb19ff2e9763f697cfaf1e9176eb8fd5fdd2c4c4d5de33e938223dd6f02: Status 404 returned error can't find the container with id 257e8fb19ff2e9763f697cfaf1e9176eb8fd5fdd2c4c4d5de33e938223dd6f02
Jan 31 15:07:12 crc kubenswrapper[4751]: E0131 15:07:12.089854 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 15:07:12 crc kubenswrapper[4751]: E0131 15:07:12.090734 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:07:28.090717387 +0000 UTC m=+1552.465430262 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found
Jan 31 15:07:12 crc kubenswrapper[4751]: E0131 15:07:12.089850 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 15:07:12 crc kubenswrapper[4751]: E0131 15:07:12.090887 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:07:28.090877311 +0000 UTC m=+1552.465590196 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found
Jan 31 15:07:12 crc kubenswrapper[4751]: I0131 15:07:12.681173 4751 generic.go:334] "Generic (PLEG): container finished" podID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerID="c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f" exitCode=0
Jan 31 15:07:12 crc kubenswrapper[4751]: I0131 15:07:12.681218 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4c48n" event={"ID":"50a49e54-556d-487d-8cdf-3fd3dc9442a5","Type":"ContainerDied","Data":"c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f"}
Jan 31 15:07:12 crc kubenswrapper[4751]: I0131 15:07:12.681242 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4c48n" event={"ID":"50a49e54-556d-487d-8cdf-3fd3dc9442a5","Type":"ContainerStarted","Data":"257e8fb19ff2e9763f697cfaf1e9176eb8fd5fdd2c4c4d5de33e938223dd6f02"}
Jan 31 15:07:14 crc kubenswrapper[4751]: I0131 15:07:14.701958 4751 generic.go:334] "Generic (PLEG): container finished" podID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerID="cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e" exitCode=0
Jan 31 15:07:14 crc kubenswrapper[4751]: I0131 15:07:14.702025 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4c48n" event={"ID":"50a49e54-556d-487d-8cdf-3fd3dc9442a5","Type":"ContainerDied","Data":"cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e"}
Jan 31 15:07:15 crc kubenswrapper[4751]: I0131 15:07:15.716509 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4c48n" event={"ID":"50a49e54-556d-487d-8cdf-3fd3dc9442a5","Type":"ContainerStarted","Data":"90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f"}
Jan 31 15:07:15 crc kubenswrapper[4751]: I0131 15:07:15.753917 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4c48n" podStartSLOduration=3.364306317 podStartE2EDuration="5.753897151s" podCreationTimestamp="2026-01-31 15:07:10 +0000 UTC" firstStartedPulling="2026-01-31 15:07:12.682971588 +0000 UTC m=+1537.057684473" lastFinishedPulling="2026-01-31 15:07:15.072562422 +0000 UTC m=+1539.447275307" observedRunningTime="2026-01-31 15:07:15.74895928 +0000 UTC m=+1540.123672185" watchObservedRunningTime="2026-01-31 15:07:15.753897151 +0000 UTC m=+1540.128610046"
Jan 31 15:07:20 crc kubenswrapper[4751]: I0131 15:07:20.910709 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76zwv"]
Jan 31 15:07:20 crc kubenswrapper[4751]: I0131 15:07:20.913209 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:20 crc kubenswrapper[4751]: I0131 15:07:20.930310 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76zwv"]
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.015824 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cwxr\" (UniqueName: \"kubernetes.io/projected/ca5a5a5e-fdc7-409c-b452-44b84779eba2-kube-api-access-6cwxr\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.015897 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-utilities\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.016195 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-catalog-content\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.117755 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-catalog-content\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.117841 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cwxr\" (UniqueName: \"kubernetes.io/projected/ca5a5a5e-fdc7-409c-b452-44b84779eba2-kube-api-access-6cwxr\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.117905 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-utilities\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.118653 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-utilities\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.119011 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-catalog-content\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.146309 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cwxr\" (UniqueName: \"kubernetes.io/projected/ca5a5a5e-fdc7-409c-b452-44b84779eba2-kube-api-access-6cwxr\") pod \"redhat-marketplace-76zwv\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " pod="openshift-marketplace/redhat-marketplace-76zwv"
Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.230288 4751 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.288746 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4c48n" Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.289039 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4c48n" Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.337125 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4c48n" Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.672121 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76zwv"] Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.755436 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76zwv" event={"ID":"ca5a5a5e-fdc7-409c-b452-44b84779eba2","Type":"ContainerStarted","Data":"304775135fd2604e43538146d5fba66160f453d9567898a2957c9c65dc840cad"} Jan 31 15:07:21 crc kubenswrapper[4751]: I0131 15:07:21.798287 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4c48n" Jan 31 15:07:22 crc kubenswrapper[4751]: I0131 15:07:22.764227 4751 generic.go:334] "Generic (PLEG): container finished" podID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerID="23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb" exitCode=0 Jan 31 15:07:22 crc kubenswrapper[4751]: I0131 15:07:22.765344 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76zwv" event={"ID":"ca5a5a5e-fdc7-409c-b452-44b84779eba2","Type":"ContainerDied","Data":"23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb"} Jan 31 15:07:23 crc kubenswrapper[4751]: I0131 15:07:23.692722 4751 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4c48n"] Jan 31 15:07:23 crc kubenswrapper[4751]: I0131 15:07:23.773128 4751 generic.go:334] "Generic (PLEG): container finished" podID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerID="108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a" exitCode=0 Jan 31 15:07:23 crc kubenswrapper[4751]: I0131 15:07:23.773171 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76zwv" event={"ID":"ca5a5a5e-fdc7-409c-b452-44b84779eba2","Type":"ContainerDied","Data":"108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a"} Jan 31 15:07:23 crc kubenswrapper[4751]: I0131 15:07:23.773333 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4c48n" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="registry-server" containerID="cri-o://90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f" gracePeriod=2 Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.176850 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4c48n" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.269602 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/50a49e54-556d-487d-8cdf-3fd3dc9442a5-kube-api-access-kgxk8\") pod \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.269694 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-catalog-content\") pod \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.269730 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-utilities\") pod \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\" (UID: \"50a49e54-556d-487d-8cdf-3fd3dc9442a5\") " Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.270905 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-utilities" (OuterVolumeSpecName: "utilities") pod "50a49e54-556d-487d-8cdf-3fd3dc9442a5" (UID: "50a49e54-556d-487d-8cdf-3fd3dc9442a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.274740 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a49e54-556d-487d-8cdf-3fd3dc9442a5-kube-api-access-kgxk8" (OuterVolumeSpecName: "kube-api-access-kgxk8") pod "50a49e54-556d-487d-8cdf-3fd3dc9442a5" (UID: "50a49e54-556d-487d-8cdf-3fd3dc9442a5"). InnerVolumeSpecName "kube-api-access-kgxk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.329919 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50a49e54-556d-487d-8cdf-3fd3dc9442a5" (UID: "50a49e54-556d-487d-8cdf-3fd3dc9442a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.371407 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/50a49e54-556d-487d-8cdf-3fd3dc9442a5-kube-api-access-kgxk8\") on node \"crc\" DevicePath \"\"" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.371440 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.371449 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a49e54-556d-487d-8cdf-3fd3dc9442a5-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.782208 4751 generic.go:334] "Generic (PLEG): container finished" podID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerID="90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f" exitCode=0 Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.782294 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4c48n" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.782299 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4c48n" event={"ID":"50a49e54-556d-487d-8cdf-3fd3dc9442a5","Type":"ContainerDied","Data":"90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f"} Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.782701 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4c48n" event={"ID":"50a49e54-556d-487d-8cdf-3fd3dc9442a5","Type":"ContainerDied","Data":"257e8fb19ff2e9763f697cfaf1e9176eb8fd5fdd2c4c4d5de33e938223dd6f02"} Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.782731 4751 scope.go:117] "RemoveContainer" containerID="90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.786016 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76zwv" event={"ID":"ca5a5a5e-fdc7-409c-b452-44b84779eba2","Type":"ContainerStarted","Data":"313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48"} Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.797334 4751 scope.go:117] "RemoveContainer" containerID="cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.802618 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4c48n"] Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.812213 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4c48n"] Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.815909 4751 scope.go:117] "RemoveContainer" containerID="c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.831830 4751 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76zwv" podStartSLOduration=3.392602762 podStartE2EDuration="4.831815094s" podCreationTimestamp="2026-01-31 15:07:20 +0000 UTC" firstStartedPulling="2026-01-31 15:07:22.766342016 +0000 UTC m=+1547.141054891" lastFinishedPulling="2026-01-31 15:07:24.205554328 +0000 UTC m=+1548.580267223" observedRunningTime="2026-01-31 15:07:24.827788226 +0000 UTC m=+1549.202501111" watchObservedRunningTime="2026-01-31 15:07:24.831815094 +0000 UTC m=+1549.206527979" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.833325 4751 scope.go:117] "RemoveContainer" containerID="90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f" Jan 31 15:07:24 crc kubenswrapper[4751]: E0131 15:07:24.834410 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f\": container with ID starting with 90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f not found: ID does not exist" containerID="90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.834447 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f"} err="failed to get container status \"90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f\": rpc error: code = NotFound desc = could not find container \"90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f\": container with ID starting with 90a4c2e5bfaff4cb9a64ddde999d46f6a1c7891174e7e8800ba33b52b355c79f not found: ID does not exist" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.834477 4751 scope.go:117] "RemoveContainer" containerID="cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e" Jan 31 15:07:24 
crc kubenswrapper[4751]: E0131 15:07:24.834798 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e\": container with ID starting with cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e not found: ID does not exist" containerID="cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.834904 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e"} err="failed to get container status \"cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e\": rpc error: code = NotFound desc = could not find container \"cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e\": container with ID starting with cb3b940cfe06f226fc954ce923f02b1e0af3b36fac9a300e9eaafb89bb90805e not found: ID does not exist" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.834996 4751 scope.go:117] "RemoveContainer" containerID="c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f" Jan 31 15:07:24 crc kubenswrapper[4751]: E0131 15:07:24.835385 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f\": container with ID starting with c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f not found: ID does not exist" containerID="c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f" Jan 31 15:07:24 crc kubenswrapper[4751]: I0131 15:07:24.835416 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f"} err="failed to get container status 
\"c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f\": rpc error: code = NotFound desc = could not find container \"c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f\": container with ID starting with c14d46d646ea8b4cc9763419a629506b15e87c33ed47f956ba8d19439f02c98f not found: ID does not exist" Jan 31 15:07:26 crc kubenswrapper[4751]: I0131 15:07:26.415350 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" path="/var/lib/kubelet/pods/50a49e54-556d-487d-8cdf-3fd3dc9442a5/volumes" Jan 31 15:07:28 crc kubenswrapper[4751]: E0131 15:07:28.119927 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:07:28 crc kubenswrapper[4751]: E0131 15:07:28.120352 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:08:00.120330439 +0000 UTC m=+1584.495043334 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:07:28 crc kubenswrapper[4751]: E0131 15:07:28.119977 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:07:28 crc kubenswrapper[4751]: E0131 15:07:28.120470 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:08:00.120441062 +0000 UTC m=+1584.495153987 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:07:31 crc kubenswrapper[4751]: I0131 15:07:31.230599 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:31 crc kubenswrapper[4751]: I0131 15:07:31.230973 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:31 crc kubenswrapper[4751]: I0131 15:07:31.280796 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:31 crc kubenswrapper[4751]: I0131 15:07:31.878629 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:31 crc kubenswrapper[4751]: I0131 15:07:31.923139 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76zwv"] Jan 31 15:07:33 crc kubenswrapper[4751]: I0131 15:07:33.859491 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-76zwv" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="registry-server" containerID="cri-o://313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48" gracePeriod=2 Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.751776 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.815873 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-catalog-content\") pod \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.815980 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-utilities\") pod \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.816016 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cwxr\" (UniqueName: \"kubernetes.io/projected/ca5a5a5e-fdc7-409c-b452-44b84779eba2-kube-api-access-6cwxr\") pod \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\" (UID: \"ca5a5a5e-fdc7-409c-b452-44b84779eba2\") " Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.817289 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-utilities" (OuterVolumeSpecName: "utilities") pod "ca5a5a5e-fdc7-409c-b452-44b84779eba2" (UID: "ca5a5a5e-fdc7-409c-b452-44b84779eba2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.839310 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5a5a5e-fdc7-409c-b452-44b84779eba2-kube-api-access-6cwxr" (OuterVolumeSpecName: "kube-api-access-6cwxr") pod "ca5a5a5e-fdc7-409c-b452-44b84779eba2" (UID: "ca5a5a5e-fdc7-409c-b452-44b84779eba2"). InnerVolumeSpecName "kube-api-access-6cwxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.841369 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca5a5a5e-fdc7-409c-b452-44b84779eba2" (UID: "ca5a5a5e-fdc7-409c-b452-44b84779eba2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.866424 4751 generic.go:334] "Generic (PLEG): container finished" podID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerID="313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48" exitCode=0 Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.866459 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76zwv" event={"ID":"ca5a5a5e-fdc7-409c-b452-44b84779eba2","Type":"ContainerDied","Data":"313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48"} Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.866482 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76zwv" event={"ID":"ca5a5a5e-fdc7-409c-b452-44b84779eba2","Type":"ContainerDied","Data":"304775135fd2604e43538146d5fba66160f453d9567898a2957c9c65dc840cad"} Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.866499 4751 scope.go:117] "RemoveContainer" containerID="313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.866539 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76zwv" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.898656 4751 scope.go:117] "RemoveContainer" containerID="108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.905063 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76zwv"] Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.910543 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-76zwv"] Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.925992 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.926025 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca5a5a5e-fdc7-409c-b452-44b84779eba2-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.926041 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cwxr\" (UniqueName: \"kubernetes.io/projected/ca5a5a5e-fdc7-409c-b452-44b84779eba2-kube-api-access-6cwxr\") on node \"crc\" DevicePath \"\"" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.926279 4751 scope.go:117] "RemoveContainer" containerID="23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.946767 4751 scope.go:117] "RemoveContainer" containerID="313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48" Jan 31 15:07:35 crc kubenswrapper[4751]: E0131 15:07:34.947299 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48\": container with ID starting with 313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48 not found: ID does not exist" containerID="313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.947339 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48"} err="failed to get container status \"313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48\": rpc error: code = NotFound desc = could not find container \"313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48\": container with ID starting with 313ed8d08566eef3d80118fd1b73dbf2b891937d2a28d957adf4fe8bc1065d48 not found: ID does not exist" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.947371 4751 scope.go:117] "RemoveContainer" containerID="108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a" Jan 31 15:07:35 crc kubenswrapper[4751]: E0131 15:07:34.947817 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a\": container with ID starting with 108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a not found: ID does not exist" containerID="108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.947865 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a"} err="failed to get container status \"108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a\": rpc error: code = NotFound desc = could not find container \"108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a\": container with ID 
starting with 108d9e539af6c029e8ea242e46655d6d091d2b9ce6876bc3e568fb564b94984a not found: ID does not exist" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.947891 4751 scope.go:117] "RemoveContainer" containerID="23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb" Jan 31 15:07:35 crc kubenswrapper[4751]: E0131 15:07:34.948182 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb\": container with ID starting with 23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb not found: ID does not exist" containerID="23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:34.948226 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb"} err="failed to get container status \"23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb\": rpc error: code = NotFound desc = could not find container \"23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb\": container with ID starting with 23d1a9651161a01cd1bee04fc08b679a060106e1e39fd6accef9df2a385409fb not found: ID does not exist" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.244793 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/util/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.444898 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/util/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.490511 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/pull/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.502571 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/pull/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.636789 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/util/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.649084 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/extract/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.672517 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/pull/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.781510 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d4cb5b58-r8xn7_91cc4333-403a-4ce4-a347-8b475ad0169a/manager/0.log" Jan 31 15:07:35 crc kubenswrapper[4751]: I0131 15:07:35.850528 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-vjs56_95bedc09-cab6-4e6b-a210-8cb1f8b39601/registry-server/0.log" Jan 31 15:07:36 crc kubenswrapper[4751]: I0131 15:07:36.416604 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" path="/var/lib/kubelet/pods/ca5a5a5e-fdc7-409c-b452-44b84779eba2/volumes" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.173278 
4751 scope.go:117] "RemoveContainer" containerID="f2d3ac70f8ddad94f9d969f2045d3e2ecc9acc9f7ef1ceb69fe7a6e69910af4e" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.216624 4751 scope.go:117] "RemoveContainer" containerID="7e789eeabd8afc4f9d1d5096f902a1d03746cbe8acdf7df1c1fc6d2741b5975c" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.235312 4751 scope.go:117] "RemoveContainer" containerID="9903c977627bd13e9ad2f5f25c1001bf58623795a6fa400f5ca5b3724b524577" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.255151 4751 scope.go:117] "RemoveContainer" containerID="1b08739497c3b40bf4675eac8a3f77cfbe93709c363b0f7d316a1a53ab0f3eab" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.278396 4751 scope.go:117] "RemoveContainer" containerID="488f4cd159917294625dbe3f504270e4c6cae704ed670c29ddb28b43bab332ff" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.292109 4751 scope.go:117] "RemoveContainer" containerID="5da73e1408c3942c575e820ab3bbf5f7e673d6aadac72064d98cb22aab529aa9" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.316208 4751 scope.go:117] "RemoveContainer" containerID="2def03042cdbf5505276d6eb76695378d7a0c3b7b97a2d260b2bb7c00d1d66d9" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.337128 4751 scope.go:117] "RemoveContainer" containerID="4fd861ffb49593c05c4ca2dc031ea7913a88e0c31f7cbaf913eca6a5819336ff" Jan 31 15:07:47 crc kubenswrapper[4751]: I0131 15:07:47.988700 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h4drr_5c630253-f658-44fb-891d-f560f1e2b577/control-plane-machine-set-operator/0.log" Jan 31 15:07:48 crc kubenswrapper[4751]: I0131 15:07:48.117750 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4gqrl_bcd7a932-6db9-4cca-b619-852242324725/kube-rbac-proxy/0.log" Jan 31 15:07:48 crc kubenswrapper[4751]: I0131 15:07:48.164174 4751 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4gqrl_bcd7a932-6db9-4cca-b619-852242324725/machine-api-operator/0.log" Jan 31 15:08:00 crc kubenswrapper[4751]: E0131 15:08:00.176422 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:08:00 crc kubenswrapper[4751]: E0131 15:08:00.176448 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:08:00 crc kubenswrapper[4751]: E0131 15:08:00.177287 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:09:04.177257883 +0000 UTC m=+1648.551970798 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:08:00 crc kubenswrapper[4751]: E0131 15:08:00.177370 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:09:04.177338385 +0000 UTC m=+1648.552051310 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.334138 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-6dhf9_6b667c31-e911-496a-9c8b-12c906e724ec/kube-rbac-proxy/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.368110 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-6dhf9_6b667c31-e911-496a-9c8b-12c906e724ec/controller/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.544388 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.740328 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.757113 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.782344 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.783797 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.997230 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:08:15 crc kubenswrapper[4751]: I0131 15:08:15.999121 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.007627 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.036295 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.166340 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.166560 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.168925 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.220781 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/controller/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.336957 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/kube-rbac-proxy/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.367368 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/frr-metrics/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.410948 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/kube-rbac-proxy-frr/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.567239 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/reloader/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.646744 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-qf86j_94655b12-be6a-4043-8f7c-80d1b7fb1a2f/frr-k8s-webhook-server/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.751336 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b999687d7-vf7mn_bd60e998-83e4-442a-98ac-c4e33d4b4765/manager/0.log" Jan 31 15:08:16 crc kubenswrapper[4751]: I0131 15:08:16.992483 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c46dd7d46-8xt78_01320eb9-ccb5-4593-866a-f49553fa7262/webhook-server/0.log" Jan 31 15:08:17 crc kubenswrapper[4751]: I0131 15:08:17.082687 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/frr/0.log" Jan 31 15:08:17 crc kubenswrapper[4751]: I0131 15:08:17.097583 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qv6gh_7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc/kube-rbac-proxy/0.log" Jan 31 15:08:17 crc kubenswrapper[4751]: I0131 15:08:17.308704 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qv6gh_7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc/speaker/0.log" Jan 31 15:08:29 crc kubenswrapper[4751]: I0131 15:08:29.846752 4751 log.go:25] "Finished 
parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb/openstackclient/0.log" Jan 31 15:08:38 crc kubenswrapper[4751]: I0131 15:08:38.896832 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:08:38 crc kubenswrapper[4751]: I0131 15:08:38.897472 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.197787 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/util/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.420524 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/util/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.444111 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/pull/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.453277 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/pull/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.621132 4751 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/util/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.632043 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/pull/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.640605 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/extract/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.795017 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-utilities/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.960914 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-content/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.988870 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-utilities/0.log" Jan 31 15:08:42 crc kubenswrapper[4751]: I0131 15:08:42.993194 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-content/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.116599 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-content/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.136181 4751 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-utilities/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.318921 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-utilities/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.508684 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-utilities/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.508726 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-content/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.540843 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-content/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.579115 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/registry-server/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.685040 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-content/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.690950 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-utilities/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.849255 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jv94g_9853dd16-26f9-4fe4-9468-52d39dd4dd1f/marketplace-operator/0.log" Jan 31 15:08:43 crc kubenswrapper[4751]: I0131 15:08:43.908824 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-utilities/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.121869 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/registry-server/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.123445 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-utilities/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.142191 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-content/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.190534 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-content/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.326128 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-utilities/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.337525 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-content/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.437569 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/registry-server/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.531384 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-utilities/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.665916 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-content/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.670654 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-utilities/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.678799 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-content/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.866062 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-content/0.log" Jan 31 15:08:44 crc kubenswrapper[4751]: I0131 15:08:44.882487 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-utilities/0.log" Jan 31 15:08:45 crc kubenswrapper[4751]: I0131 15:08:45.297004 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/registry-server/0.log" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.467124 4751 scope.go:117] "RemoveContainer" containerID="f05a5057693bfdfb7d9c10870add4a18c1b97e05d99f428268b3b93785058feb" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.487338 
4751 scope.go:117] "RemoveContainer" containerID="f75375e8e6ad82f0f02e30825660a61882c0595e19792c2979a8125e9bf94686" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.527773 4751 scope.go:117] "RemoveContainer" containerID="79a10f8ac34beb7889999938f10a1f8fd98e243cac870f30f1dc184e88a0e786" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.539994 4751 scope.go:117] "RemoveContainer" containerID="2d17a7d49f7479975731597d7e17ac81d17ead2b622a0ef7093e781d499f7009" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.566887 4751 scope.go:117] "RemoveContainer" containerID="0e1d80ca3a8421336cb1b11f5bd0a2d183f47c5e60dedbf720f6c08836e3d291" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.612594 4751 scope.go:117] "RemoveContainer" containerID="748bb24fed6fe40319dbeeaf8bdfc4e48c0cf8e80d0e06626f9b2a7dd29a8843" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.628841 4751 scope.go:117] "RemoveContainer" containerID="a97088bc226d5155802489f7ac6a208ee9b1cacfbbb954588201d395f2a07500" Jan 31 15:08:47 crc kubenswrapper[4751]: I0131 15:08:47.649776 4751 scope.go:117] "RemoveContainer" containerID="2145d899923840c33715fa17628307294c5047421a38139dabc06cf3d05cb997" Jan 31 15:09:04 crc kubenswrapper[4751]: E0131 15:09:04.185715 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:09:04 crc kubenswrapper[4751]: E0131 15:09:04.186222 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:11:06.186203452 +0000 UTC m=+1770.560916347 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:09:04 crc kubenswrapper[4751]: E0131 15:09:04.185722 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:09:04 crc kubenswrapper[4751]: E0131 15:09:04.186374 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:11:06.186340486 +0000 UTC m=+1770.561053401 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:09:08 crc kubenswrapper[4751]: I0131 15:09:08.896737 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:09:08 crc kubenswrapper[4751]: I0131 15:09:08.897405 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:09:38 crc kubenswrapper[4751]: I0131 15:09:38.897113 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:09:38 crc kubenswrapper[4751]: I0131 15:09:38.898276 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:09:38 crc kubenswrapper[4751]: I0131 15:09:38.898373 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 15:09:38 crc kubenswrapper[4751]: I0131 15:09:38.898828 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 15:09:38 crc kubenswrapper[4751]: I0131 15:09:38.898952 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" gracePeriod=600 Jan 31 15:09:39 crc kubenswrapper[4751]: E0131 15:09:39.020455 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:09:39 crc kubenswrapper[4751]: I0131 15:09:39.637364 4751 generic.go:334] "Generic (PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" exitCode=0 Jan 31 15:09:39 crc kubenswrapper[4751]: I0131 15:09:39.637746 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130"} Jan 31 15:09:39 crc kubenswrapper[4751]: I0131 15:09:39.637927 4751 scope.go:117] "RemoveContainer" containerID="89a88ddaeae8a6fe7859be79e45bc66e157a0d02a03f5daf69e0ab6320bd15be" Jan 31 15:09:39 crc kubenswrapper[4751]: I0131 15:09:39.638710 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:09:39 crc kubenswrapper[4751]: E0131 15:09:39.639109 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:09:47 crc kubenswrapper[4751]: I0131 15:09:47.733969 4751 scope.go:117] "RemoveContainer" containerID="15a7c13661f7a9dc9cca48ea38cbda46b049856ab09d05ef63e1d7c0a14b8bb5" Jan 31 15:09:47 crc kubenswrapper[4751]: I0131 15:09:47.773855 4751 scope.go:117] "RemoveContainer" 
containerID="25f84d0f51f45c02503d2025ee5bcd86d54fb4126f654afe8e8c27150f9da926" Jan 31 15:09:47 crc kubenswrapper[4751]: I0131 15:09:47.789474 4751 scope.go:117] "RemoveContainer" containerID="2e90cc31ff36ceaadebe3379b42c48741b099a92854117d92e72b66bca77ad69" Jan 31 15:09:47 crc kubenswrapper[4751]: I0131 15:09:47.809760 4751 scope.go:117] "RemoveContainer" containerID="04a2620fed6cde572c43eab031fe61d9c4a7478ffe007510ee4e0e1e7a876ff4" Jan 31 15:09:53 crc kubenswrapper[4751]: I0131 15:09:53.405916 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:09:53 crc kubenswrapper[4751]: E0131 15:09:53.406552 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:09:54 crc kubenswrapper[4751]: I0131 15:09:54.741736 4751 generic.go:334] "Generic (PLEG): container finished" podID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerID="6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a" exitCode=0 Jan 31 15:09:54 crc kubenswrapper[4751]: I0131 15:09:54.741813 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rzlpf/must-gather-k47wq" event={"ID":"e85b3ee2-7979-400f-a052-d00fe6e44fd8","Type":"ContainerDied","Data":"6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a"} Jan 31 15:09:54 crc kubenswrapper[4751]: I0131 15:09:54.742626 4751 scope.go:117] "RemoveContainer" containerID="6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a" Jan 31 15:09:55 crc kubenswrapper[4751]: I0131 15:09:55.032871 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-rzlpf_must-gather-k47wq_e85b3ee2-7979-400f-a052-d00fe6e44fd8/gather/0.log" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.066166 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rzlpf/must-gather-k47wq"] Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.066902 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rzlpf/must-gather-k47wq" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="copy" containerID="cri-o://73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611" gracePeriod=2 Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.070153 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rzlpf/must-gather-k47wq"] Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.522131 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rzlpf_must-gather-k47wq_e85b3ee2-7979-400f-a052-d00fe6e44fd8/copy/0.log" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.522742 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rzlpf/must-gather-k47wq" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.687949 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4br7s\" (UniqueName: \"kubernetes.io/projected/e85b3ee2-7979-400f-a052-d00fe6e44fd8-kube-api-access-4br7s\") pod \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.688052 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85b3ee2-7979-400f-a052-d00fe6e44fd8-must-gather-output\") pod \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\" (UID: \"e85b3ee2-7979-400f-a052-d00fe6e44fd8\") " Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.692794 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85b3ee2-7979-400f-a052-d00fe6e44fd8-kube-api-access-4br7s" (OuterVolumeSpecName: "kube-api-access-4br7s") pod "e85b3ee2-7979-400f-a052-d00fe6e44fd8" (UID: "e85b3ee2-7979-400f-a052-d00fe6e44fd8"). InnerVolumeSpecName "kube-api-access-4br7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.751131 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85b3ee2-7979-400f-a052-d00fe6e44fd8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e85b3ee2-7979-400f-a052-d00fe6e44fd8" (UID: "e85b3ee2-7979-400f-a052-d00fe6e44fd8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.789741 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4br7s\" (UniqueName: \"kubernetes.io/projected/e85b3ee2-7979-400f-a052-d00fe6e44fd8-kube-api-access-4br7s\") on node \"crc\" DevicePath \"\"" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.789777 4751 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85b3ee2-7979-400f-a052-d00fe6e44fd8-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.799803 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rzlpf_must-gather-k47wq_e85b3ee2-7979-400f-a052-d00fe6e44fd8/copy/0.log" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.800220 4751 generic.go:334] "Generic (PLEG): container finished" podID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerID="73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611" exitCode=143 Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.800269 4751 scope.go:117] "RemoveContainer" containerID="73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.800273 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rzlpf/must-gather-k47wq" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.814058 4751 scope.go:117] "RemoveContainer" containerID="6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.863593 4751 scope.go:117] "RemoveContainer" containerID="73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611" Jan 31 15:10:02 crc kubenswrapper[4751]: E0131 15:10:02.864051 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611\": container with ID starting with 73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611 not found: ID does not exist" containerID="73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.864099 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611"} err="failed to get container status \"73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611\": rpc error: code = NotFound desc = could not find container \"73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611\": container with ID starting with 73c40052238eafbfd679d14fc1f9ec13e388944d29049da084027f407ea9e611 not found: ID does not exist" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.864125 4751 scope.go:117] "RemoveContainer" containerID="6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a" Jan 31 15:10:02 crc kubenswrapper[4751]: E0131 15:10:02.864812 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a\": container with ID starting with 
6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a not found: ID does not exist" containerID="6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a" Jan 31 15:10:02 crc kubenswrapper[4751]: I0131 15:10:02.864837 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a"} err="failed to get container status \"6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a\": rpc error: code = NotFound desc = could not find container \"6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a\": container with ID starting with 6309ccb308ea404e3c296e7d0e15d3d0347ba6fcd799c7b7cbf6ada270683b8a not found: ID does not exist" Jan 31 15:10:04 crc kubenswrapper[4751]: I0131 15:10:04.414706 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" path="/var/lib/kubelet/pods/e85b3ee2-7979-400f-a052-d00fe6e44fd8/volumes" Jan 31 15:10:05 crc kubenswrapper[4751]: I0131 15:10:05.405949 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:10:05 crc kubenswrapper[4751]: E0131 15:10:05.406220 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:10:20 crc kubenswrapper[4751]: I0131 15:10:20.405858 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:10:20 crc kubenswrapper[4751]: E0131 15:10:20.407137 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:10:31 crc kubenswrapper[4751]: I0131 15:10:31.407320 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:10:31 crc kubenswrapper[4751]: E0131 15:10:31.408409 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:10:43 crc kubenswrapper[4751]: I0131 15:10:43.406439 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:10:43 crc kubenswrapper[4751]: E0131 15:10:43.407305 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:10:47 crc kubenswrapper[4751]: I0131 15:10:47.893165 4751 scope.go:117] "RemoveContainer" containerID="77d9f01225cc43eac33fe40d8bc014694a35ae20a7b11d1e4c070bd741ce303a" Jan 31 15:10:47 crc kubenswrapper[4751]: I0131 15:10:47.931405 4751 scope.go:117] "RemoveContainer" 
containerID="c486e82ff06dabf3bbaf584cc05f4bf167ea45034bb1b4f577adb93e884d0e62" Jan 31 15:10:47 crc kubenswrapper[4751]: I0131 15:10:47.964268 4751 scope.go:117] "RemoveContainer" containerID="a87d6cd135483f11e653acd5122adb7f7e32f94e7051f9157fa4ae04850a4813" Jan 31 15:10:58 crc kubenswrapper[4751]: I0131 15:10:58.406308 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:10:58 crc kubenswrapper[4751]: E0131 15:10:58.407619 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:11:06 crc kubenswrapper[4751]: E0131 15:11:06.186667 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:11:06 crc kubenswrapper[4751]: E0131 15:11:06.186702 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:11:06 crc kubenswrapper[4751]: E0131 15:11:06.186991 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:13:08.186969408 +0000 UTC m=+1892.561682323 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:11:06 crc kubenswrapper[4751]: E0131 15:11:06.187174 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:13:08.187125332 +0000 UTC m=+1892.561838217 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:11:12 crc kubenswrapper[4751]: I0131 15:11:12.406062 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:11:12 crc kubenswrapper[4751]: E0131 15:11:12.406843 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:11:24 crc kubenswrapper[4751]: I0131 15:11:24.406778 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:11:24 crc kubenswrapper[4751]: E0131 15:11:24.407775 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:11:38 crc kubenswrapper[4751]: I0131 15:11:38.407401 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:11:38 crc kubenswrapper[4751]: E0131 15:11:38.408558 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:11:48 crc kubenswrapper[4751]: I0131 15:11:48.032395 4751 scope.go:117] "RemoveContainer" containerID="8c9ca246c6d8d22550b0a337fe277ae3824da506216668e0ce9b2ebcd4cee908" Jan 31 15:11:52 crc kubenswrapper[4751]: I0131 15:11:52.406369 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:11:52 crc kubenswrapper[4751]: E0131 15:11:52.407010 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:12:06 crc kubenswrapper[4751]: I0131 15:12:06.413644 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:12:06 crc 
kubenswrapper[4751]: E0131 15:12:06.415357 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:12:21 crc kubenswrapper[4751]: I0131 15:12:21.406505 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:12:21 crc kubenswrapper[4751]: E0131 15:12:21.407469 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.845434 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5n7n/must-gather-kfmqg"] Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846199 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="extract-content" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846214 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="extract-content" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846231 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="extract-utilities" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846238 4751 
state_mem.go:107] "Deleted CPUSet assignment" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="extract-utilities" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846248 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="registry-server" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846257 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="registry-server" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846266 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="extract-utilities" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846271 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="extract-utilities" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846282 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="extract-content" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846288 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="extract-content" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846296 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="registry-server" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846302 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="registry-server" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846316 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="gather" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846321 4751 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="gather" Jan 31 15:12:30 crc kubenswrapper[4751]: E0131 15:12:30.846331 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="copy" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846338 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="copy" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846430 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a49e54-556d-487d-8cdf-3fd3dc9442a5" containerName="registry-server" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846441 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="copy" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846449 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5a5a5e-fdc7-409c-b452-44b84779eba2" containerName="registry-server" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.846456 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85b3ee2-7979-400f-a052-d00fe6e44fd8" containerName="gather" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.847001 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.849621 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n5n7n"/"kube-root-ca.crt" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.850942 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n5n7n"/"openshift-service-ca.crt" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.857294 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n5n7n/must-gather-kfmqg"] Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.940348 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8169bcf4-4b12-4458-af34-08e57ab8e72a-must-gather-output\") pod \"must-gather-kfmqg\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:30 crc kubenswrapper[4751]: I0131 15:12:30.940543 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4fz\" (UniqueName: \"kubernetes.io/projected/8169bcf4-4b12-4458-af34-08e57ab8e72a-kube-api-access-fh4fz\") pod \"must-gather-kfmqg\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:31 crc kubenswrapper[4751]: I0131 15:12:31.042125 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4fz\" (UniqueName: \"kubernetes.io/projected/8169bcf4-4b12-4458-af34-08e57ab8e72a-kube-api-access-fh4fz\") pod \"must-gather-kfmqg\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:31 crc kubenswrapper[4751]: I0131 15:12:31.042498 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8169bcf4-4b12-4458-af34-08e57ab8e72a-must-gather-output\") pod \"must-gather-kfmqg\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:31 crc kubenswrapper[4751]: I0131 15:12:31.042938 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8169bcf4-4b12-4458-af34-08e57ab8e72a-must-gather-output\") pod \"must-gather-kfmqg\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:31 crc kubenswrapper[4751]: I0131 15:12:31.061947 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh4fz\" (UniqueName: \"kubernetes.io/projected/8169bcf4-4b12-4458-af34-08e57ab8e72a-kube-api-access-fh4fz\") pod \"must-gather-kfmqg\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:31 crc kubenswrapper[4751]: I0131 15:12:31.170004 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:12:31 crc kubenswrapper[4751]: I0131 15:12:31.548952 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n5n7n/must-gather-kfmqg"] Jan 31 15:12:32 crc kubenswrapper[4751]: I0131 15:12:32.011873 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" event={"ID":"8169bcf4-4b12-4458-af34-08e57ab8e72a","Type":"ContainerStarted","Data":"d92eb14a52593d6fc22bb08aec8df39b0e7882a4fa59d23aac4f2c37f671f001"} Jan 31 15:12:32 crc kubenswrapper[4751]: I0131 15:12:32.011926 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" event={"ID":"8169bcf4-4b12-4458-af34-08e57ab8e72a","Type":"ContainerStarted","Data":"cc356eac62cc8a9dc1d2dd948da59d8ed49fe19d58d0b369f5e6af812f05dc08"} Jan 31 15:12:32 crc kubenswrapper[4751]: I0131 15:12:32.011941 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" event={"ID":"8169bcf4-4b12-4458-af34-08e57ab8e72a","Type":"ContainerStarted","Data":"f5a96fe58f3e9aa570743df67af39e2d59170e23b6208375ca96cb810eba42fc"} Jan 31 15:12:32 crc kubenswrapper[4751]: I0131 15:12:32.026996 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" podStartSLOduration=2.026979815 podStartE2EDuration="2.026979815s" podCreationTimestamp="2026-01-31 15:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:12:32.024014725 +0000 UTC m=+1856.398727610" watchObservedRunningTime="2026-01-31 15:12:32.026979815 +0000 UTC m=+1856.401692700" Jan 31 15:12:33 crc kubenswrapper[4751]: I0131 15:12:33.406342 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:12:33 crc 
kubenswrapper[4751]: E0131 15:12:33.406848 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:12:47 crc kubenswrapper[4751]: I0131 15:12:47.405879 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:12:47 crc kubenswrapper[4751]: E0131 15:12:47.406620 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:12:48 crc kubenswrapper[4751]: I0131 15:12:48.090722 4751 scope.go:117] "RemoveContainer" containerID="d083c8910ad51529e38c062770d9b1a45be20502e7c76553d0090ac7a9898be5" Jan 31 15:12:48 crc kubenswrapper[4751]: I0131 15:12:48.108086 4751 scope.go:117] "RemoveContainer" containerID="b5cb3ee4032129b568b4ee0fa56e2f13d4d48986ad6a3c19ca00fa7b56b0e716" Jan 31 15:12:59 crc kubenswrapper[4751]: I0131 15:12:59.405269 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:12:59 crc kubenswrapper[4751]: E0131 15:12:59.405722 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:13:08 crc kubenswrapper[4751]: E0131 15:13:08.240750 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:13:08 crc kubenswrapper[4751]: E0131 15:13:08.241333 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:15:10.241315453 +0000 UTC m=+2014.616028338 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:13:08 crc kubenswrapper[4751]: E0131 15:13:08.240816 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:13:08 crc kubenswrapper[4751]: E0131 15:13:08.241443 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:15:10.241423386 +0000 UTC m=+2014.616136271 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.297379 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/util/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.401050 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/pull/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.419250 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/util/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.457584 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/pull/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.754394 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/pull/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.759544 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/util/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.820153 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926c857f_eec59a88-8f4d-4482-aa2a-11a508cc3a79/extract/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.911159 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d4cb5b58-r8xn7_91cc4333-403a-4ce4-a347-8b475ad0169a/manager/0.log" Jan 31 15:13:08 crc kubenswrapper[4751]: I0131 15:13:08.985341 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-vjs56_95bedc09-cab6-4e6b-a210-8cb1f8b39601/registry-server/0.log" Jan 31 15:13:12 crc kubenswrapper[4751]: I0131 15:13:12.406394 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:13:12 crc kubenswrapper[4751]: E0131 15:13:12.406941 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:13:22 crc kubenswrapper[4751]: I0131 15:13:22.256876 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h4drr_5c630253-f658-44fb-891d-f560f1e2b577/control-plane-machine-set-operator/0.log" Jan 31 15:13:22 crc kubenswrapper[4751]: I0131 15:13:22.430371 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4gqrl_bcd7a932-6db9-4cca-b619-852242324725/kube-rbac-proxy/0.log" Jan 31 15:13:22 crc kubenswrapper[4751]: I0131 15:13:22.433770 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4gqrl_bcd7a932-6db9-4cca-b619-852242324725/machine-api-operator/0.log" Jan 31 15:13:23 crc kubenswrapper[4751]: I0131 15:13:23.406290 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:13:23 crc kubenswrapper[4751]: E0131 15:13:23.406532 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:13:35 crc kubenswrapper[4751]: I0131 15:13:35.405964 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:13:35 crc kubenswrapper[4751]: E0131 15:13:35.406811 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:13:48 crc kubenswrapper[4751]: I0131 15:13:48.405761 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:13:48 crc kubenswrapper[4751]: E0131 15:13:48.406454 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:13:50 crc kubenswrapper[4751]: I0131 15:13:50.817553 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-6dhf9_6b667c31-e911-496a-9c8b-12c906e724ec/kube-rbac-proxy/0.log" Jan 31 15:13:50 crc kubenswrapper[4751]: I0131 15:13:50.843221 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-6dhf9_6b667c31-e911-496a-9c8b-12c906e724ec/controller/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.014418 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.162193 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.174540 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.189698 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.203887 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.370927 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 
15:13:51.385035 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.385642 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.411893 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.596786 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-frr-files/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.620021 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-metrics/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.625100 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/controller/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.625816 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/cp-reloader/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.788788 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/frr-metrics/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.822583 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/kube-rbac-proxy/0.log" Jan 31 15:13:51 crc kubenswrapper[4751]: I0131 15:13:51.879885 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/kube-rbac-proxy-frr/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.004939 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/reloader/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.063293 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-qf86j_94655b12-be6a-4043-8f7c-80d1b7fb1a2f/frr-k8s-webhook-server/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.341135 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b999687d7-vf7mn_bd60e998-83e4-442a-98ac-c4e33d4b4765/manager/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.352741 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c46dd7d46-8xt78_01320eb9-ccb5-4593-866a-f49553fa7262/webhook-server/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.490255 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9z9n2_b1f214e9-14db-462f-900c-3652ec7908e5/frr/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.531679 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qv6gh_7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc/kube-rbac-proxy/0.log" Jan 31 15:13:52 crc kubenswrapper[4751]: I0131 15:13:52.745205 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qv6gh_7e8b57b9-8a0b-4ef2-83c9-85f9e60c9adc/speaker/0.log" Jan 31 15:14:02 crc kubenswrapper[4751]: I0131 15:14:02.406449 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:14:02 crc kubenswrapper[4751]: E0131 15:14:02.407585 4751 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:14:05 crc kubenswrapper[4751]: I0131 15:14:05.057545 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb/openstackclient/0.log" Jan 31 15:14:14 crc kubenswrapper[4751]: I0131 15:14:14.406187 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:14:14 crc kubenswrapper[4751]: E0131 15:14:14.407144 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.399137 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/util/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.517935 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/util/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.547593 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/pull/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.591112 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/pull/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.718450 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/util/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.734058 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/extract/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.765494 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczsbcz_f3380dc7-49d9-4d61-a0bb-003c1c5e2742/pull/0.log" Jan 31 15:14:17 crc kubenswrapper[4751]: I0131 15:14:17.860267 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-utilities/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.035033 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-utilities/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.073625 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-content/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.079821 4751 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-content/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.225662 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-content/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.233743 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/extract-utilities/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.432275 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-utilities/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.587149 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-utilities/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.597775 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-content/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.639435 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-content/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.678347 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qcs7h_c83f0a10-f56b-4795-93b9-ee224d439648/registry-server/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.806319 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-utilities/0.log" Jan 31 15:14:18 crc kubenswrapper[4751]: I0131 15:14:18.818451 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/extract-content/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.056683 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jv94g_9853dd16-26f9-4fe4-9468-52d39dd4dd1f/marketplace-operator/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.086777 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-utilities/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.190768 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gr5gf_2eb5e3aa-17fa-49a0-a422-bc69a8a410fb/registry-server/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.319682 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-utilities/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.327166 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-content/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.345811 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-content/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.469335 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-utilities/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.470620 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/extract-content/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.597076 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22krg_affc293d-ac4e-49ad-be4a-bc13d7c056a7/registry-server/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.640924 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-utilities/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.809625 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-content/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.816107 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-utilities/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.859429 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-content/0.log" Jan 31 15:14:19 crc kubenswrapper[4751]: I0131 15:14:19.953861 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-utilities/0.log" Jan 31 15:14:20 crc kubenswrapper[4751]: I0131 15:14:20.019199 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/extract-content/0.log" Jan 
31 15:14:20 crc kubenswrapper[4751]: I0131 15:14:20.521278 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-678m7_43fbbbf2-c128-46a4-9cc3-99e46c617027/registry-server/0.log" Jan 31 15:14:28 crc kubenswrapper[4751]: I0131 15:14:28.406427 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:14:28 crc kubenswrapper[4751]: E0131 15:14:28.407183 4751 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wpj7_openshift-machine-config-operator(b4c170e8-22c9-43a9-8b34-9d626c2ccddc)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" Jan 31 15:14:43 crc kubenswrapper[4751]: I0131 15:14:43.405741 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:14:43 crc kubenswrapper[4751]: I0131 15:14:43.814711 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"1255c884133e00fc9c5d808129089de90e3ff1b6af74e3a15a0350ae021f2f6b"} Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.152677 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x"] Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.154626 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.160381 4751 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.160437 4751 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.161651 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x"] Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.322154 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce076099-f84a-49c6-9566-cae17c8efd6d-config-volume\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.322208 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kqz\" (UniqueName: \"kubernetes.io/projected/ce076099-f84a-49c6-9566-cae17c8efd6d-kube-api-access-j6kqz\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.322290 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce076099-f84a-49c6-9566-cae17c8efd6d-secret-volume\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.423662 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce076099-f84a-49c6-9566-cae17c8efd6d-secret-volume\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.423779 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce076099-f84a-49c6-9566-cae17c8efd6d-config-volume\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.423800 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kqz\" (UniqueName: \"kubernetes.io/projected/ce076099-f84a-49c6-9566-cae17c8efd6d-kube-api-access-j6kqz\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.425651 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce076099-f84a-49c6-9566-cae17c8efd6d-config-volume\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.443906 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ce076099-f84a-49c6-9566-cae17c8efd6d-secret-volume\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.464267 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kqz\" (UniqueName: \"kubernetes.io/projected/ce076099-f84a-49c6-9566-cae17c8efd6d-kube-api-access-j6kqz\") pod \"collect-profiles-29497875-rng7x\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.479110 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.730420 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x"] Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.935650 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" event={"ID":"ce076099-f84a-49c6-9566-cae17c8efd6d","Type":"ContainerStarted","Data":"6779d327b09eddc653cf10f40050a18611bafc3b2fcb0394ad4d3ee6ba27c365"} Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.935882 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" event={"ID":"ce076099-f84a-49c6-9566-cae17c8efd6d","Type":"ContainerStarted","Data":"ce54e21704ea948301d201e015fb2c4d5600e26e25a29f067f6f22ad0e132991"} Jan 31 15:15:00 crc kubenswrapper[4751]: I0131 15:15:00.956481 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" 
podStartSLOduration=0.956461328 podStartE2EDuration="956.461328ms" podCreationTimestamp="2026-01-31 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 15:15:00.955472112 +0000 UTC m=+2005.330184997" watchObservedRunningTime="2026-01-31 15:15:00.956461328 +0000 UTC m=+2005.331174213" Jan 31 15:15:01 crc kubenswrapper[4751]: I0131 15:15:01.942964 4751 generic.go:334] "Generic (PLEG): container finished" podID="ce076099-f84a-49c6-9566-cae17c8efd6d" containerID="6779d327b09eddc653cf10f40050a18611bafc3b2fcb0394ad4d3ee6ba27c365" exitCode=0 Jan 31 15:15:01 crc kubenswrapper[4751]: I0131 15:15:01.943119 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" event={"ID":"ce076099-f84a-49c6-9566-cae17c8efd6d","Type":"ContainerDied","Data":"6779d327b09eddc653cf10f40050a18611bafc3b2fcb0394ad4d3ee6ba27c365"} Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.174636 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.363753 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce076099-f84a-49c6-9566-cae17c8efd6d-config-volume\") pod \"ce076099-f84a-49c6-9566-cae17c8efd6d\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.364035 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce076099-f84a-49c6-9566-cae17c8efd6d-secret-volume\") pod \"ce076099-f84a-49c6-9566-cae17c8efd6d\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.364355 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce076099-f84a-49c6-9566-cae17c8efd6d-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce076099-f84a-49c6-9566-cae17c8efd6d" (UID: "ce076099-f84a-49c6-9566-cae17c8efd6d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.365132 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6kqz\" (UniqueName: \"kubernetes.io/projected/ce076099-f84a-49c6-9566-cae17c8efd6d-kube-api-access-j6kqz\") pod \"ce076099-f84a-49c6-9566-cae17c8efd6d\" (UID: \"ce076099-f84a-49c6-9566-cae17c8efd6d\") " Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.365520 4751 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce076099-f84a-49c6-9566-cae17c8efd6d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.369141 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce076099-f84a-49c6-9566-cae17c8efd6d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ce076099-f84a-49c6-9566-cae17c8efd6d" (UID: "ce076099-f84a-49c6-9566-cae17c8efd6d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.370091 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce076099-f84a-49c6-9566-cae17c8efd6d-kube-api-access-j6kqz" (OuterVolumeSpecName: "kube-api-access-j6kqz") pod "ce076099-f84a-49c6-9566-cae17c8efd6d" (UID: "ce076099-f84a-49c6-9566-cae17c8efd6d"). InnerVolumeSpecName "kube-api-access-j6kqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.467137 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6kqz\" (UniqueName: \"kubernetes.io/projected/ce076099-f84a-49c6-9566-cae17c8efd6d-kube-api-access-j6kqz\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.467179 4751 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce076099-f84a-49c6-9566-cae17c8efd6d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.958352 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" event={"ID":"ce076099-f84a-49c6-9566-cae17c8efd6d","Type":"ContainerDied","Data":"ce54e21704ea948301d201e015fb2c4d5600e26e25a29f067f6f22ad0e132991"} Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.958391 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497875-rng7x" Jan 31 15:15:03 crc kubenswrapper[4751]: I0131 15:15:03.958411 4751 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce54e21704ea948301d201e015fb2c4d5600e26e25a29f067f6f22ad0e132991" Jan 31 15:15:04 crc kubenswrapper[4751]: I0131 15:15:04.255169 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"] Jan 31 15:15:04 crc kubenswrapper[4751]: I0131 15:15:04.262916 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497830-rpwmp"] Jan 31 15:15:04 crc kubenswrapper[4751]: I0131 15:15:04.420454 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eade01dc-846b-42a8-a6ed-8cf0a0663e82" path="/var/lib/kubelet/pods/eade01dc-846b-42a8-a6ed-8cf0a0663e82/volumes" Jan 31 15:15:10 crc kubenswrapper[4751]: E0131 15:15:10.269385 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:15:10 crc kubenswrapper[4751]: E0131 15:15:10.270421 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:17:12.270380754 +0000 UTC m=+2136.645093679 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:15:10 crc kubenswrapper[4751]: E0131 15:15:10.271366 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:15:10 crc kubenswrapper[4751]: E0131 15:15:10.271458 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:17:12.271433142 +0000 UTC m=+2136.646146067 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:15:32 crc kubenswrapper[4751]: I0131 15:15:32.187180 4751 generic.go:334] "Generic (PLEG): container finished" podID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerID="cc356eac62cc8a9dc1d2dd948da59d8ed49fe19d58d0b369f5e6af812f05dc08" exitCode=0 Jan 31 15:15:32 crc kubenswrapper[4751]: I0131 15:15:32.187627 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" event={"ID":"8169bcf4-4b12-4458-af34-08e57ab8e72a","Type":"ContainerDied","Data":"cc356eac62cc8a9dc1d2dd948da59d8ed49fe19d58d0b369f5e6af812f05dc08"} Jan 31 15:15:32 crc kubenswrapper[4751]: I0131 15:15:32.188195 4751 scope.go:117] "RemoveContainer" containerID="cc356eac62cc8a9dc1d2dd948da59d8ed49fe19d58d0b369f5e6af812f05dc08" Jan 31 15:15:33 crc kubenswrapper[4751]: I0131 15:15:33.046697 4751 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-n5n7n_must-gather-kfmqg_8169bcf4-4b12-4458-af34-08e57ab8e72a/gather/0.log" Jan 31 15:15:41 crc kubenswrapper[4751]: I0131 15:15:41.744696 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5n7n/must-gather-kfmqg"] Jan 31 15:15:41 crc kubenswrapper[4751]: I0131 15:15:41.745754 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="copy" containerID="cri-o://d92eb14a52593d6fc22bb08aec8df39b0e7882a4fa59d23aac4f2c37f671f001" gracePeriod=2 Jan 31 15:15:41 crc kubenswrapper[4751]: I0131 15:15:41.753942 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5n7n/must-gather-kfmqg"] Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.256706 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5n7n_must-gather-kfmqg_8169bcf4-4b12-4458-af34-08e57ab8e72a/copy/0.log" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.257800 4751 generic.go:334] "Generic (PLEG): container finished" podID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerID="d92eb14a52593d6fc22bb08aec8df39b0e7882a4fa59d23aac4f2c37f671f001" exitCode=143 Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.301755 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5n7n_must-gather-kfmqg_8169bcf4-4b12-4458-af34-08e57ab8e72a/copy/0.log" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.302141 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.335276 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh4fz\" (UniqueName: \"kubernetes.io/projected/8169bcf4-4b12-4458-af34-08e57ab8e72a-kube-api-access-fh4fz\") pod \"8169bcf4-4b12-4458-af34-08e57ab8e72a\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.335395 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8169bcf4-4b12-4458-af34-08e57ab8e72a-must-gather-output\") pod \"8169bcf4-4b12-4458-af34-08e57ab8e72a\" (UID: \"8169bcf4-4b12-4458-af34-08e57ab8e72a\") " Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.355713 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8169bcf4-4b12-4458-af34-08e57ab8e72a-kube-api-access-fh4fz" (OuterVolumeSpecName: "kube-api-access-fh4fz") pod "8169bcf4-4b12-4458-af34-08e57ab8e72a" (UID: "8169bcf4-4b12-4458-af34-08e57ab8e72a"). InnerVolumeSpecName "kube-api-access-fh4fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.418776 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8169bcf4-4b12-4458-af34-08e57ab8e72a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8169bcf4-4b12-4458-af34-08e57ab8e72a" (UID: "8169bcf4-4b12-4458-af34-08e57ab8e72a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.422520 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" path="/var/lib/kubelet/pods/8169bcf4-4b12-4458-af34-08e57ab8e72a/volumes" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.438275 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh4fz\" (UniqueName: \"kubernetes.io/projected/8169bcf4-4b12-4458-af34-08e57ab8e72a-kube-api-access-fh4fz\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.438304 4751 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8169bcf4-4b12-4458-af34-08e57ab8e72a-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.537871 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7q8qd"] Jan 31 15:15:42 crc kubenswrapper[4751]: E0131 15:15:42.538181 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="gather" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.538197 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="gather" Jan 31 15:15:42 crc kubenswrapper[4751]: E0131 15:15:42.538216 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="copy" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.538224 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="copy" Jan 31 15:15:42 crc kubenswrapper[4751]: E0131 15:15:42.538249 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce076099-f84a-49c6-9566-cae17c8efd6d" containerName="collect-profiles" Jan 
31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.538257 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce076099-f84a-49c6-9566-cae17c8efd6d" containerName="collect-profiles" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.538392 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce076099-f84a-49c6-9566-cae17c8efd6d" containerName="collect-profiles" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.538412 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="gather" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.538422 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="8169bcf4-4b12-4458-af34-08e57ab8e72a" containerName="copy" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.539381 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.551307 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7q8qd"] Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.640465 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-catalog-content\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.640509 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz9xp\" (UniqueName: \"kubernetes.io/projected/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-kube-api-access-qz9xp\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 
15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.640560 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-utilities\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.741765 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-catalog-content\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.741818 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz9xp\" (UniqueName: \"kubernetes.io/projected/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-kube-api-access-qz9xp\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.741894 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-utilities\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.742287 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-catalog-content\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc 
kubenswrapper[4751]: I0131 15:15:42.742379 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-utilities\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.758836 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz9xp\" (UniqueName: \"kubernetes.io/projected/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-kube-api-access-qz9xp\") pod \"community-operators-7q8qd\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:42 crc kubenswrapper[4751]: I0131 15:15:42.860227 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:43 crc kubenswrapper[4751]: I0131 15:15:43.199921 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7q8qd"] Jan 31 15:15:43 crc kubenswrapper[4751]: I0131 15:15:43.264204 4751 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5n7n_must-gather-kfmqg_8169bcf4-4b12-4458-af34-08e57ab8e72a/copy/0.log" Jan 31 15:15:43 crc kubenswrapper[4751]: I0131 15:15:43.264553 4751 scope.go:117] "RemoveContainer" containerID="d92eb14a52593d6fc22bb08aec8df39b0e7882a4fa59d23aac4f2c37f671f001" Jan 31 15:15:43 crc kubenswrapper[4751]: I0131 15:15:43.264621 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5n7n/must-gather-kfmqg" Jan 31 15:15:43 crc kubenswrapper[4751]: I0131 15:15:43.265481 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q8qd" event={"ID":"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611","Type":"ContainerStarted","Data":"afe835c1ad53315ecbe4f222bec28e72de4c1ccdcb10edb681720dd2cc8c1f4c"} Jan 31 15:15:43 crc kubenswrapper[4751]: I0131 15:15:43.279195 4751 scope.go:117] "RemoveContainer" containerID="cc356eac62cc8a9dc1d2dd948da59d8ed49fe19d58d0b369f5e6af812f05dc08" Jan 31 15:15:44 crc kubenswrapper[4751]: I0131 15:15:44.274583 4751 generic.go:334] "Generic (PLEG): container finished" podID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerID="abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f" exitCode=0 Jan 31 15:15:44 crc kubenswrapper[4751]: I0131 15:15:44.274695 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q8qd" event={"ID":"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611","Type":"ContainerDied","Data":"abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f"} Jan 31 15:15:44 crc kubenswrapper[4751]: I0131 15:15:44.278240 4751 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 15:15:46 crc kubenswrapper[4751]: I0131 15:15:46.293961 4751 generic.go:334] "Generic (PLEG): container finished" podID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerID="07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d" exitCode=0 Jan 31 15:15:46 crc kubenswrapper[4751]: I0131 15:15:46.294063 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q8qd" event={"ID":"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611","Type":"ContainerDied","Data":"07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d"} Jan 31 15:15:47 crc kubenswrapper[4751]: I0131 15:15:47.303476 4751 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-7q8qd" event={"ID":"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611","Type":"ContainerStarted","Data":"3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31"} Jan 31 15:15:47 crc kubenswrapper[4751]: I0131 15:15:47.340744 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7q8qd" podStartSLOduration=2.912751546 podStartE2EDuration="5.340719813s" podCreationTimestamp="2026-01-31 15:15:42 +0000 UTC" firstStartedPulling="2026-01-31 15:15:44.277892486 +0000 UTC m=+2048.652605381" lastFinishedPulling="2026-01-31 15:15:46.705860753 +0000 UTC m=+2051.080573648" observedRunningTime="2026-01-31 15:15:47.335354619 +0000 UTC m=+2051.710067514" watchObservedRunningTime="2026-01-31 15:15:47.340719813 +0000 UTC m=+2051.715432698" Jan 31 15:15:48 crc kubenswrapper[4751]: I0131 15:15:48.183375 4751 scope.go:117] "RemoveContainer" containerID="9d26a6d6092efc3cfe1b53bda2539e32fc75d0f27a288ecda4b2062254a0fc73" Jan 31 15:15:52 crc kubenswrapper[4751]: I0131 15:15:52.860569 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:52 crc kubenswrapper[4751]: I0131 15:15:52.861100 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:52 crc kubenswrapper[4751]: I0131 15:15:52.912809 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:53 crc kubenswrapper[4751]: I0131 15:15:53.389712 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:53 crc kubenswrapper[4751]: I0131 15:15:53.434986 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7q8qd"] Jan 31 15:15:55 crc 
kubenswrapper[4751]: I0131 15:15:55.352664 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7q8qd" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="registry-server" containerID="cri-o://3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31" gracePeriod=2 Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.711014 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.816300 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz9xp\" (UniqueName: \"kubernetes.io/projected/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-kube-api-access-qz9xp\") pod \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.816443 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-catalog-content\") pod \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.817138 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-utilities\") pod \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\" (UID: \"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611\") " Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.817550 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-utilities" (OuterVolumeSpecName: "utilities") pod "f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" (UID: "f0de97a3-8d1b-4cb9-baf2-94cdac6fa611"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.821940 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-kube-api-access-qz9xp" (OuterVolumeSpecName: "kube-api-access-qz9xp") pod "f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" (UID: "f0de97a3-8d1b-4cb9-baf2-94cdac6fa611"). InnerVolumeSpecName "kube-api-access-qz9xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.868350 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" (UID: "f0de97a3-8d1b-4cb9-baf2-94cdac6fa611"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.918727 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz9xp\" (UniqueName: \"kubernetes.io/projected/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-kube-api-access-qz9xp\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.918774 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:55 crc kubenswrapper[4751]: I0131 15:15:55.918792 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.364007 4751 generic.go:334] "Generic (PLEG): container finished" podID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" 
containerID="3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31" exitCode=0 Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.364057 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q8qd" event={"ID":"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611","Type":"ContainerDied","Data":"3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31"} Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.364110 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7q8qd" event={"ID":"f0de97a3-8d1b-4cb9-baf2-94cdac6fa611","Type":"ContainerDied","Data":"afe835c1ad53315ecbe4f222bec28e72de4c1ccdcb10edb681720dd2cc8c1f4c"} Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.364151 4751 scope.go:117] "RemoveContainer" containerID="3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.364197 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7q8qd" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.397917 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7q8qd"] Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.400237 4751 scope.go:117] "RemoveContainer" containerID="07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.402763 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7q8qd"] Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.413920 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" path="/var/lib/kubelet/pods/f0de97a3-8d1b-4cb9-baf2-94cdac6fa611/volumes" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.421188 4751 scope.go:117] "RemoveContainer" containerID="abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.441688 4751 scope.go:117] "RemoveContainer" containerID="3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31" Jan 31 15:15:56 crc kubenswrapper[4751]: E0131 15:15:56.442119 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31\": container with ID starting with 3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31 not found: ID does not exist" containerID="3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.442163 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31"} err="failed to get container status 
\"3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31\": rpc error: code = NotFound desc = could not find container \"3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31\": container with ID starting with 3a1afa3565b79940e6783ac512f7eb65596e794ecce6cdfddc4f941b25daaa31 not found: ID does not exist" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.442188 4751 scope.go:117] "RemoveContainer" containerID="07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d" Jan 31 15:15:56 crc kubenswrapper[4751]: E0131 15:15:56.442564 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d\": container with ID starting with 07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d not found: ID does not exist" containerID="07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.442601 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d"} err="failed to get container status \"07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d\": rpc error: code = NotFound desc = could not find container \"07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d\": container with ID starting with 07a29af813c56d3015d9196299ce1c32648d906dd1592919368710f2b8adff3d not found: ID does not exist" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.442626 4751 scope.go:117] "RemoveContainer" containerID="abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f" Jan 31 15:15:56 crc kubenswrapper[4751]: E0131 15:15:56.442889 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f\": container with ID starting with abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f not found: ID does not exist" containerID="abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f" Jan 31 15:15:56 crc kubenswrapper[4751]: I0131 15:15:56.442915 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f"} err="failed to get container status \"abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f\": rpc error: code = NotFound desc = could not find container \"abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f\": container with ID starting with abd04023c388da640bc843dc513c7f9d80f0d9a758b339a8165a2bc9c26df42f not found: ID does not exist" Jan 31 15:17:08 crc kubenswrapper[4751]: I0131 15:17:08.896208 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:17:08 crc kubenswrapper[4751]: I0131 15:17:08.898099 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:17:12 crc kubenswrapper[4751]: E0131 15:17:12.318503 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:17:12 crc kubenswrapper[4751]: E0131 15:17:12.318769 4751 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:19:14.318753784 +0000 UTC m=+2258.693466669 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:17:12 crc kubenswrapper[4751]: E0131 15:17:12.318549 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:17:12 crc kubenswrapper[4751]: E0131 15:17:12.318882 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:19:14.318862597 +0000 UTC m=+2258.693575482 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.832247 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvzz"] Jan 31 15:17:28 crc kubenswrapper[4751]: E0131 15:17:28.833219 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="extract-content" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.833241 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="extract-content" Jan 31 15:17:28 crc kubenswrapper[4751]: E0131 15:17:28.833274 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="registry-server" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.833286 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="registry-server" Jan 31 15:17:28 crc kubenswrapper[4751]: E0131 15:17:28.833325 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="extract-utilities" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.833338 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="extract-utilities" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.833530 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0de97a3-8d1b-4cb9-baf2-94cdac6fa611" containerName="registry-server" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.837096 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.846243 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvzz"] Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.944241 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-catalog-content\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.944561 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p44s\" (UniqueName: \"kubernetes.io/projected/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-kube-api-access-7p44s\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:28 crc kubenswrapper[4751]: I0131 15:17:28.944685 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-utilities\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.045445 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-catalog-content\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.045761 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7p44s\" (UniqueName: \"kubernetes.io/projected/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-kube-api-access-7p44s\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.045880 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-utilities\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.046512 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-utilities\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.046876 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-catalog-content\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.076703 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p44s\" (UniqueName: \"kubernetes.io/projected/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-kube-api-access-7p44s\") pod \"redhat-marketplace-5fvzz\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.165024 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:29 crc kubenswrapper[4751]: I0131 15:17:29.564228 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvzz"] Jan 31 15:17:30 crc kubenswrapper[4751]: I0131 15:17:30.030419 4751 generic.go:334] "Generic (PLEG): container finished" podID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerID="1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7" exitCode=0 Jan 31 15:17:30 crc kubenswrapper[4751]: I0131 15:17:30.030494 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvzz" event={"ID":"aa2d3a4e-15bd-4b0d-b187-a4db9049522f","Type":"ContainerDied","Data":"1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7"} Jan 31 15:17:30 crc kubenswrapper[4751]: I0131 15:17:30.030768 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvzz" event={"ID":"aa2d3a4e-15bd-4b0d-b187-a4db9049522f","Type":"ContainerStarted","Data":"425504c025aad8a53ebdc18ebf06cc6728cfc8eaec5f1ce8efad85e1606a66c7"} Jan 31 15:17:31 crc kubenswrapper[4751]: I0131 15:17:31.039452 4751 generic.go:334] "Generic (PLEG): container finished" podID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerID="f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88" exitCode=0 Jan 31 15:17:31 crc kubenswrapper[4751]: I0131 15:17:31.039517 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvzz" event={"ID":"aa2d3a4e-15bd-4b0d-b187-a4db9049522f","Type":"ContainerDied","Data":"f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88"} Jan 31 15:17:32 crc kubenswrapper[4751]: I0131 15:17:32.048148 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvzz" 
event={"ID":"aa2d3a4e-15bd-4b0d-b187-a4db9049522f","Type":"ContainerStarted","Data":"f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08"} Jan 31 15:17:32 crc kubenswrapper[4751]: I0131 15:17:32.074176 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5fvzz" podStartSLOduration=2.659696797 podStartE2EDuration="4.074159997s" podCreationTimestamp="2026-01-31 15:17:28 +0000 UTC" firstStartedPulling="2026-01-31 15:17:30.031910007 +0000 UTC m=+2154.406622892" lastFinishedPulling="2026-01-31 15:17:31.446373197 +0000 UTC m=+2155.821086092" observedRunningTime="2026-01-31 15:17:32.070810218 +0000 UTC m=+2156.445523103" watchObservedRunningTime="2026-01-31 15:17:32.074159997 +0000 UTC m=+2156.448872882" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.199425 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-884dp"] Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.200920 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.211884 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-884dp"] Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.243340 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvk2\" (UniqueName: \"kubernetes.io/projected/22e797bc-1dbd-481f-bb51-c4a04114ecda-kube-api-access-gnvk2\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.243465 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-utilities\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.243576 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-catalog-content\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.344785 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvk2\" (UniqueName: \"kubernetes.io/projected/22e797bc-1dbd-481f-bb51-c4a04114ecda-kube-api-access-gnvk2\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.344855 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-utilities\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.344926 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-catalog-content\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.345479 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-catalog-content\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.345653 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-utilities\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.370475 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvk2\" (UniqueName: \"kubernetes.io/projected/22e797bc-1dbd-481f-bb51-c4a04114ecda-kube-api-access-gnvk2\") pod \"redhat-operators-884dp\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.524747 4751 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:35 crc kubenswrapper[4751]: I0131 15:17:35.770584 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-884dp"] Jan 31 15:17:36 crc kubenswrapper[4751]: I0131 15:17:36.073057 4751 generic.go:334] "Generic (PLEG): container finished" podID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerID="a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770" exitCode=0 Jan 31 15:17:36 crc kubenswrapper[4751]: I0131 15:17:36.073327 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-884dp" event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerDied","Data":"a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770"} Jan 31 15:17:36 crc kubenswrapper[4751]: I0131 15:17:36.073822 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-884dp" event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerStarted","Data":"03a99742f46b04ca6f4e4681bc1cc7dbb9c2e6413829a7704f70d7b3e8fddd54"} Jan 31 15:17:37 crc kubenswrapper[4751]: I0131 15:17:37.081170 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-884dp" event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerStarted","Data":"45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74"} Jan 31 15:17:38 crc kubenswrapper[4751]: I0131 15:17:38.090299 4751 generic.go:334] "Generic (PLEG): container finished" podID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerID="45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74" exitCode=0 Jan 31 15:17:38 crc kubenswrapper[4751]: I0131 15:17:38.090350 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-884dp" 
event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerDied","Data":"45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74"} Jan 31 15:17:38 crc kubenswrapper[4751]: I0131 15:17:38.896317 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:17:38 crc kubenswrapper[4751]: I0131 15:17:38.896401 4751 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:17:39 crc kubenswrapper[4751]: I0131 15:17:39.165417 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:39 crc kubenswrapper[4751]: I0131 15:17:39.165502 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:39 crc kubenswrapper[4751]: I0131 15:17:39.206345 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:40 crc kubenswrapper[4751]: I0131 15:17:40.102546 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-884dp" event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerStarted","Data":"e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b"} Jan 31 15:17:40 crc kubenswrapper[4751]: I0131 15:17:40.134283 4751 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-884dp" 
podStartSLOduration=1.640223669 podStartE2EDuration="5.134262649s" podCreationTimestamp="2026-01-31 15:17:35 +0000 UTC" firstStartedPulling="2026-01-31 15:17:36.075107065 +0000 UTC m=+2160.449819950" lastFinishedPulling="2026-01-31 15:17:39.569146035 +0000 UTC m=+2163.943858930" observedRunningTime="2026-01-31 15:17:40.133793817 +0000 UTC m=+2164.508506702" watchObservedRunningTime="2026-01-31 15:17:40.134262649 +0000 UTC m=+2164.508975534" Jan 31 15:17:40 crc kubenswrapper[4751]: I0131 15:17:40.156688 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:40 crc kubenswrapper[4751]: I0131 15:17:40.588466 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvzz"] Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.114107 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5fvzz" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="registry-server" containerID="cri-o://f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08" gracePeriod=2 Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.647850 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.742549 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-catalog-content\") pod \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.742612 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p44s\" (UniqueName: \"kubernetes.io/projected/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-kube-api-access-7p44s\") pod \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.742681 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-utilities\") pod \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\" (UID: \"aa2d3a4e-15bd-4b0d-b187-a4db9049522f\") " Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.743758 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-utilities" (OuterVolumeSpecName: "utilities") pod "aa2d3a4e-15bd-4b0d-b187-a4db9049522f" (UID: "aa2d3a4e-15bd-4b0d-b187-a4db9049522f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.747754 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-kube-api-access-7p44s" (OuterVolumeSpecName: "kube-api-access-7p44s") pod "aa2d3a4e-15bd-4b0d-b187-a4db9049522f" (UID: "aa2d3a4e-15bd-4b0d-b187-a4db9049522f"). InnerVolumeSpecName "kube-api-access-7p44s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.769504 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa2d3a4e-15bd-4b0d-b187-a4db9049522f" (UID: "aa2d3a4e-15bd-4b0d-b187-a4db9049522f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.844534 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.844569 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p44s\" (UniqueName: \"kubernetes.io/projected/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-kube-api-access-7p44s\") on node \"crc\" DevicePath \"\"" Jan 31 15:17:42 crc kubenswrapper[4751]: I0131 15:17:42.844582 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2d3a4e-15bd-4b0d-b187-a4db9049522f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.122048 4751 generic.go:334] "Generic (PLEG): container finished" podID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerID="f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08" exitCode=0 Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.122118 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvzz" event={"ID":"aa2d3a4e-15bd-4b0d-b187-a4db9049522f","Type":"ContainerDied","Data":"f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08"} Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.122186 4751 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5fvzz" event={"ID":"aa2d3a4e-15bd-4b0d-b187-a4db9049522f","Type":"ContainerDied","Data":"425504c025aad8a53ebdc18ebf06cc6728cfc8eaec5f1ce8efad85e1606a66c7"} Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.122228 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvzz" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.122211 4751 scope.go:117] "RemoveContainer" containerID="f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.140940 4751 scope.go:117] "RemoveContainer" containerID="f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.161246 4751 scope.go:117] "RemoveContainer" containerID="1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.196905 4751 scope.go:117] "RemoveContainer" containerID="f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08" Jan 31 15:17:43 crc kubenswrapper[4751]: E0131 15:17:43.198448 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08\": container with ID starting with f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08 not found: ID does not exist" containerID="f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.198533 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08"} err="failed to get container status \"f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08\": rpc error: code = NotFound desc = could not find container 
\"f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08\": container with ID starting with f2e5db66425ffe36121edf1059e80641ecbb035af5d94d86cad041ad2f36fd08 not found: ID does not exist" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.198599 4751 scope.go:117] "RemoveContainer" containerID="f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88" Jan 31 15:17:43 crc kubenswrapper[4751]: E0131 15:17:43.200160 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88\": container with ID starting with f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88 not found: ID does not exist" containerID="f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.200260 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88"} err="failed to get container status \"f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88\": rpc error: code = NotFound desc = could not find container \"f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88\": container with ID starting with f5cc5de70aba5a8a756cd1dc84d47958d0b1e225fc2bd378950d452ba598ee88 not found: ID does not exist" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.200345 4751 scope.go:117] "RemoveContainer" containerID="1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.204846 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvzz"] Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.226185 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvzz"] Jan 31 15:17:43 crc kubenswrapper[4751]: 
E0131 15:17:43.206238 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7\": container with ID starting with 1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7 not found: ID does not exist" containerID="1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7" Jan 31 15:17:43 crc kubenswrapper[4751]: I0131 15:17:43.226284 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7"} err="failed to get container status \"1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7\": rpc error: code = NotFound desc = could not find container \"1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7\": container with ID starting with 1bb577b2261a76c8b92f3b1e2f277f6b87a8febd1df30295d8a1636ce64ffba7 not found: ID does not exist" Jan 31 15:17:44 crc kubenswrapper[4751]: I0131 15:17:44.413585 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" path="/var/lib/kubelet/pods/aa2d3a4e-15bd-4b0d-b187-a4db9049522f/volumes" Jan 31 15:17:45 crc kubenswrapper[4751]: I0131 15:17:45.527329 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:45 crc kubenswrapper[4751]: I0131 15:17:45.527746 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:46 crc kubenswrapper[4751]: I0131 15:17:46.573994 4751 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-884dp" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="registry-server" probeResult="failure" output=< Jan 31 15:17:46 crc kubenswrapper[4751]: timeout: failed to connect service 
":50051" within 1s Jan 31 15:17:46 crc kubenswrapper[4751]: > Jan 31 15:17:55 crc kubenswrapper[4751]: I0131 15:17:55.576951 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:55 crc kubenswrapper[4751]: I0131 15:17:55.632609 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:55 crc kubenswrapper[4751]: I0131 15:17:55.811467 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-884dp"] Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.210262 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-884dp" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="registry-server" containerID="cri-o://e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b" gracePeriod=2 Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.530394 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.550942 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-catalog-content\") pod \"22e797bc-1dbd-481f-bb51-c4a04114ecda\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.552280 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-utilities\") pod \"22e797bc-1dbd-481f-bb51-c4a04114ecda\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.552317 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnvk2\" (UniqueName: \"kubernetes.io/projected/22e797bc-1dbd-481f-bb51-c4a04114ecda-kube-api-access-gnvk2\") pod \"22e797bc-1dbd-481f-bb51-c4a04114ecda\" (UID: \"22e797bc-1dbd-481f-bb51-c4a04114ecda\") " Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.554607 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-utilities" (OuterVolumeSpecName: "utilities") pod "22e797bc-1dbd-481f-bb51-c4a04114ecda" (UID: "22e797bc-1dbd-481f-bb51-c4a04114ecda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.557886 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e797bc-1dbd-481f-bb51-c4a04114ecda-kube-api-access-gnvk2" (OuterVolumeSpecName: "kube-api-access-gnvk2") pod "22e797bc-1dbd-481f-bb51-c4a04114ecda" (UID: "22e797bc-1dbd-481f-bb51-c4a04114ecda"). InnerVolumeSpecName "kube-api-access-gnvk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.654135 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.654188 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnvk2\" (UniqueName: \"kubernetes.io/projected/22e797bc-1dbd-481f-bb51-c4a04114ecda-kube-api-access-gnvk2\") on node \"crc\" DevicePath \"\"" Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.681840 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22e797bc-1dbd-481f-bb51-c4a04114ecda" (UID: "22e797bc-1dbd-481f-bb51-c4a04114ecda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:17:57 crc kubenswrapper[4751]: I0131 15:17:57.755874 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e797bc-1dbd-481f-bb51-c4a04114ecda-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.220650 4751 generic.go:334] "Generic (PLEG): container finished" podID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerID="e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b" exitCode=0 Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.220703 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-884dp" event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerDied","Data":"e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b"} Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.220755 4751 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-884dp" event={"ID":"22e797bc-1dbd-481f-bb51-c4a04114ecda","Type":"ContainerDied","Data":"03a99742f46b04ca6f4e4681bc1cc7dbb9c2e6413829a7704f70d7b3e8fddd54"} Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.220779 4751 scope.go:117] "RemoveContainer" containerID="e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.220825 4751 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-884dp" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.246059 4751 scope.go:117] "RemoveContainer" containerID="45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.268358 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-884dp"] Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.274614 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-884dp"] Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.279289 4751 scope.go:117] "RemoveContainer" containerID="a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.298632 4751 scope.go:117] "RemoveContainer" containerID="e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b" Jan 31 15:17:58 crc kubenswrapper[4751]: E0131 15:17:58.299019 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b\": container with ID starting with e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b not found: ID does not exist" containerID="e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.299049 4751 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b"} err="failed to get container status \"e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b\": rpc error: code = NotFound desc = could not find container \"e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b\": container with ID starting with e2ed793a1f8ffc1705a8422a83553acff596b5770ccaf5a3d6759f957020a04b not found: ID does not exist" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.299084 4751 scope.go:117] "RemoveContainer" containerID="45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74" Jan 31 15:17:58 crc kubenswrapper[4751]: E0131 15:17:58.299442 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74\": container with ID starting with 45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74 not found: ID does not exist" containerID="45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.299491 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74"} err="failed to get container status \"45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74\": rpc error: code = NotFound desc = could not find container \"45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74\": container with ID starting with 45d268c59b0d70d8f64877dfb7af473e60a933653390856ec83bb4377a58cb74 not found: ID does not exist" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.299523 4751 scope.go:117] "RemoveContainer" containerID="a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770" Jan 31 15:17:58 crc kubenswrapper[4751]: E0131 
15:17:58.299808 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770\": container with ID starting with a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770 not found: ID does not exist" containerID="a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.299839 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770"} err="failed to get container status \"a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770\": rpc error: code = NotFound desc = could not find container \"a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770\": container with ID starting with a89d2ffa602b194f1e3ddd9004485c6ff999c6fb57b08c748be6c30f28b34770 not found: ID does not exist" Jan 31 15:17:58 crc kubenswrapper[4751]: I0131 15:17:58.412528 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" path="/var/lib/kubelet/pods/22e797bc-1dbd-481f-bb51-c4a04114ecda/volumes" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.939667 4751 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kjsw9"] Jan 31 15:18:05 crc kubenswrapper[4751]: E0131 15:18:05.940363 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="extract-utilities" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940376 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="extract-utilities" Jan 31 15:18:05 crc kubenswrapper[4751]: E0131 15:18:05.940389 4751 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="extract-utilities" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940394 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="extract-utilities" Jan 31 15:18:05 crc kubenswrapper[4751]: E0131 15:18:05.940417 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="registry-server" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940425 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="registry-server" Jan 31 15:18:05 crc kubenswrapper[4751]: E0131 15:18:05.940431 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="extract-content" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940437 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="extract-content" Jan 31 15:18:05 crc kubenswrapper[4751]: E0131 15:18:05.940711 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="extract-content" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940721 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="extract-content" Jan 31 15:18:05 crc kubenswrapper[4751]: E0131 15:18:05.940729 4751 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="registry-server" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940737 4751 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="registry-server" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940915 4751 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="22e797bc-1dbd-481f-bb51-c4a04114ecda" containerName="registry-server" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.940969 4751 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2d3a4e-15bd-4b0d-b187-a4db9049522f" containerName="registry-server" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.947988 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.952352 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjsw9"] Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.997685 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nqjj\" (UniqueName: \"kubernetes.io/projected/1be979bb-7f99-4bd6-90f4-c2684f713320-kube-api-access-7nqjj\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:05 crc kubenswrapper[4751]: I0131 15:18:05.998467 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-catalog-content\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.000392 4751 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-utilities\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.101277 4751 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7nqjj\" (UniqueName: \"kubernetes.io/projected/1be979bb-7f99-4bd6-90f4-c2684f713320-kube-api-access-7nqjj\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.101901 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-catalog-content\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.102427 4751 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-utilities\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.102363 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-catalog-content\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.102697 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-utilities\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.121658 4751 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7nqjj\" (UniqueName: \"kubernetes.io/projected/1be979bb-7f99-4bd6-90f4-c2684f713320-kube-api-access-7nqjj\") pod \"certified-operators-kjsw9\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.283210 4751 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:06 crc kubenswrapper[4751]: I0131 15:18:06.689517 4751 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjsw9"] Jan 31 15:18:07 crc kubenswrapper[4751]: I0131 15:18:07.277900 4751 generic.go:334] "Generic (PLEG): container finished" podID="1be979bb-7f99-4bd6-90f4-c2684f713320" containerID="db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549" exitCode=0 Jan 31 15:18:07 crc kubenswrapper[4751]: I0131 15:18:07.277993 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjsw9" event={"ID":"1be979bb-7f99-4bd6-90f4-c2684f713320","Type":"ContainerDied","Data":"db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549"} Jan 31 15:18:07 crc kubenswrapper[4751]: I0131 15:18:07.278230 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjsw9" event={"ID":"1be979bb-7f99-4bd6-90f4-c2684f713320","Type":"ContainerStarted","Data":"4d2c99bfa9a6d824026e8ec15e419c946f638ae4d0eea2478784cbb66e66ee81"} Jan 31 15:18:08 crc kubenswrapper[4751]: I0131 15:18:08.897152 4751 patch_prober.go:28] interesting pod/machine-config-daemon-2wpj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 15:18:08 crc kubenswrapper[4751]: I0131 15:18:08.897549 4751 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 15:18:08 crc kubenswrapper[4751]: I0131 15:18:08.897612 4751 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" Jan 31 15:18:08 crc kubenswrapper[4751]: I0131 15:18:08.898414 4751 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1255c884133e00fc9c5d808129089de90e3ff1b6af74e3a15a0350ae021f2f6b"} pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 15:18:08 crc kubenswrapper[4751]: I0131 15:18:08.898492 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" podUID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerName="machine-config-daemon" containerID="cri-o://1255c884133e00fc9c5d808129089de90e3ff1b6af74e3a15a0350ae021f2f6b" gracePeriod=600 Jan 31 15:18:09 crc kubenswrapper[4751]: I0131 15:18:09.309127 4751 generic.go:334] "Generic (PLEG): container finished" podID="1be979bb-7f99-4bd6-90f4-c2684f713320" containerID="549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf" exitCode=0 Jan 31 15:18:09 crc kubenswrapper[4751]: I0131 15:18:09.309181 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjsw9" event={"ID":"1be979bb-7f99-4bd6-90f4-c2684f713320","Type":"ContainerDied","Data":"549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf"} Jan 31 15:18:09 crc kubenswrapper[4751]: I0131 15:18:09.318329 4751 generic.go:334] "Generic 
(PLEG): container finished" podID="b4c170e8-22c9-43a9-8b34-9d626c2ccddc" containerID="1255c884133e00fc9c5d808129089de90e3ff1b6af74e3a15a0350ae021f2f6b" exitCode=0 Jan 31 15:18:09 crc kubenswrapper[4751]: I0131 15:18:09.318370 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerDied","Data":"1255c884133e00fc9c5d808129089de90e3ff1b6af74e3a15a0350ae021f2f6b"} Jan 31 15:18:09 crc kubenswrapper[4751]: I0131 15:18:09.318399 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wpj7" event={"ID":"b4c170e8-22c9-43a9-8b34-9d626c2ccddc","Type":"ContainerStarted","Data":"4723d2a8bf569fe8e42b360fff841d94c9bc30a64ad462cbd7d1ed8c7287cce3"} Jan 31 15:18:09 crc kubenswrapper[4751]: I0131 15:18:09.318415 4751 scope.go:117] "RemoveContainer" containerID="1e91cdb1164832b2457470e9c5a6b63801cdbe4c69db7c3369929241eb60a130" Jan 31 15:18:10 crc kubenswrapper[4751]: I0131 15:18:10.327477 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjsw9" event={"ID":"1be979bb-7f99-4bd6-90f4-c2684f713320","Type":"ContainerStarted","Data":"8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca"} Jan 31 15:18:16 crc kubenswrapper[4751]: I0131 15:18:16.286165 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:16 crc kubenswrapper[4751]: I0131 15:18:16.286817 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:16 crc kubenswrapper[4751]: I0131 15:18:16.368704 4751 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:16 crc kubenswrapper[4751]: I0131 15:18:16.401048 4751 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kjsw9" podStartSLOduration=8.991680537 podStartE2EDuration="11.401027546s" podCreationTimestamp="2026-01-31 15:18:05 +0000 UTC" firstStartedPulling="2026-01-31 15:18:07.280274026 +0000 UTC m=+2191.654986951" lastFinishedPulling="2026-01-31 15:18:09.689621075 +0000 UTC m=+2194.064333960" observedRunningTime="2026-01-31 15:18:10.351178849 +0000 UTC m=+2194.725891744" watchObservedRunningTime="2026-01-31 15:18:16.401027546 +0000 UTC m=+2200.775740431" Jan 31 15:18:16 crc kubenswrapper[4751]: I0131 15:18:16.435057 4751 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:16 crc kubenswrapper[4751]: I0131 15:18:16.613628 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kjsw9"] Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.387943 4751 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kjsw9" podUID="1be979bb-7f99-4bd6-90f4-c2684f713320" containerName="registry-server" containerID="cri-o://8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca" gracePeriod=2 Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.862648 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.893576 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-catalog-content\") pod \"1be979bb-7f99-4bd6-90f4-c2684f713320\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.893663 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nqjj\" (UniqueName: \"kubernetes.io/projected/1be979bb-7f99-4bd6-90f4-c2684f713320-kube-api-access-7nqjj\") pod \"1be979bb-7f99-4bd6-90f4-c2684f713320\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.893687 4751 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-utilities\") pod \"1be979bb-7f99-4bd6-90f4-c2684f713320\" (UID: \"1be979bb-7f99-4bd6-90f4-c2684f713320\") " Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.894987 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-utilities" (OuterVolumeSpecName: "utilities") pod "1be979bb-7f99-4bd6-90f4-c2684f713320" (UID: "1be979bb-7f99-4bd6-90f4-c2684f713320"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.902686 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be979bb-7f99-4bd6-90f4-c2684f713320-kube-api-access-7nqjj" (OuterVolumeSpecName: "kube-api-access-7nqjj") pod "1be979bb-7f99-4bd6-90f4-c2684f713320" (UID: "1be979bb-7f99-4bd6-90f4-c2684f713320"). InnerVolumeSpecName "kube-api-access-7nqjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.950420 4751 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1be979bb-7f99-4bd6-90f4-c2684f713320" (UID: "1be979bb-7f99-4bd6-90f4-c2684f713320"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.995377 4751 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.995408 4751 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nqjj\" (UniqueName: \"kubernetes.io/projected/1be979bb-7f99-4bd6-90f4-c2684f713320-kube-api-access-7nqjj\") on node \"crc\" DevicePath \"\"" Jan 31 15:18:18 crc kubenswrapper[4751]: I0131 15:18:18.995421 4751 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1be979bb-7f99-4bd6-90f4-c2684f713320-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.399479 4751 generic.go:334] "Generic (PLEG): container finished" podID="1be979bb-7f99-4bd6-90f4-c2684f713320" containerID="8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca" exitCode=0 Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.399638 4751 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjsw9" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.401715 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjsw9" event={"ID":"1be979bb-7f99-4bd6-90f4-c2684f713320","Type":"ContainerDied","Data":"8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca"} Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.401975 4751 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjsw9" event={"ID":"1be979bb-7f99-4bd6-90f4-c2684f713320","Type":"ContainerDied","Data":"4d2c99bfa9a6d824026e8ec15e419c946f638ae4d0eea2478784cbb66e66ee81"} Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.402036 4751 scope.go:117] "RemoveContainer" containerID="8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.428601 4751 scope.go:117] "RemoveContainer" containerID="549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.444198 4751 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kjsw9"] Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.447919 4751 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kjsw9"] Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.461208 4751 scope.go:117] "RemoveContainer" containerID="db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.478430 4751 scope.go:117] "RemoveContainer" containerID="8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca" Jan 31 15:18:19 crc kubenswrapper[4751]: E0131 15:18:19.479104 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca\": container with ID starting with 8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca not found: ID does not exist" containerID="8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.479180 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca"} err="failed to get container status \"8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca\": rpc error: code = NotFound desc = could not find container \"8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca\": container with ID starting with 8bc374e7f9872d1345d7f5c7b05d8fc1fb69e7fcc61226e9a5fa05de7371baca not found: ID does not exist" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.479208 4751 scope.go:117] "RemoveContainer" containerID="549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf" Jan 31 15:18:19 crc kubenswrapper[4751]: E0131 15:18:19.479642 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf\": container with ID starting with 549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf not found: ID does not exist" containerID="549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.479696 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf"} err="failed to get container status \"549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf\": rpc error: code = NotFound desc = could not find container \"549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf\": container with ID 
starting with 549328def8f959fb462b14b23e7a9d4bbfe2f09a052ae1b1417a29da200d9acf not found: ID does not exist" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.479715 4751 scope.go:117] "RemoveContainer" containerID="db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549" Jan 31 15:18:19 crc kubenswrapper[4751]: E0131 15:18:19.480005 4751 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549\": container with ID starting with db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549 not found: ID does not exist" containerID="db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549" Jan 31 15:18:19 crc kubenswrapper[4751]: I0131 15:18:19.480035 4751 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549"} err="failed to get container status \"db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549\": rpc error: code = NotFound desc = could not find container \"db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549\": container with ID starting with db5726e8f7117d04a15da9374f1fae5fd3062caebf44687ad07700cc1c6d2549 not found: ID does not exist" Jan 31 15:18:20 crc kubenswrapper[4751]: I0131 15:18:20.411823 4751 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be979bb-7f99-4bd6-90f4-c2684f713320" path="/var/lib/kubelet/pods/1be979bb-7f99-4bd6-90f4-c2684f713320/volumes" Jan 31 15:19:14 crc kubenswrapper[4751]: E0131 15:19:14.325208 4751 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 15:19:14 crc kubenswrapper[4751]: E0131 15:19:14.325439 4751 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 15:19:14 crc 
kubenswrapper[4751]: E0131 15:19:14.325823 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:21:16.32579767 +0000 UTC m=+2380.700510545 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : configmap "openstack-config" not found Jan 31 15:19:14 crc kubenswrapper[4751]: E0131 15:19:14.325936 4751 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret podName:eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb nodeName:}" failed. No retries permitted until 2026-01-31 15:21:16.325899913 +0000 UTC m=+2380.700612838 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb-openstack-config-secret") pod "openstackclient" (UID: "eb3a93b7-3d78-48ab-b2e0-d31a7adefaeb") : secret "openstack-config-secret" not found